Joey Bose Profile
Joey Bose

@bose_joey

Followers
4K
Following
10K
Media
85
Statuses
2K

Assistant Professor @imperialcollege and @Mila_Quebec Affiliate member. Into Geometry ⋃ Generative Models ⋃ AI4Science. Ex-@UniofOxford, @Mila_Quebec, @UofT.

London
Joined January 2018
@bose_joey
Joey Bose
4 months
🎉Personal update: I'm thrilled to announce that I'm joining Imperial College London @imperialcollege as an Assistant Professor of Computing @ICComputing starting January 2026. My future lab and I will continue to work on building better Generative Models 🤖, the hardest…
97
34
608
@bose_joey
Joey Bose
1 day
Flow Maps are all the rage these days. In this new work we generalize them to Riemannian manifolds. SOTA results 🚀, few-step inference 🔥, and it generalizes many few-step generative models on manifolds in one slick framework 😎. Great work with even greater collaborators!
@osclsd
Oscar Davis
1 day
Introducing Generalised Flow Maps 🎉 A stable, few-step generative model on Riemannian manifolds 🪩 📚 Read it at: https://t.co/iCTHedwCxf 💾 Code: https://t.co/MeukcthFN2 @msalbergo @nmboffi @mmbronstein @bose_joey
1
8
82
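The paper itself isn't shown here, so as a toy illustration of the manifold setting (not the authors' method): the manifold analogue of flow matching's straight-line interpolant is a geodesic, e.g. great-circle interpolation (slerp) on the sphere. A few-step sampler can jump along that geodesic in large steps while staying exactly on the manifold. Everything below is a hypothetical sketch.

```python
import numpy as np

def slerp(x0, x1, t):
    """Geodesic (great-circle) interpolation between unit vectors x0 and x1."""
    omega = np.arccos(np.clip(np.dot(x0, x1), -1.0, 1.0))  # angle between the points
    if np.isclose(omega, 0.0):
        return x0.copy()
    return (np.sin((1 - t) * omega) * x0 + np.sin(t * omega) * x1) / np.sin(omega)

# A "noise" point and a "data" point on the 2-sphere (toy assumption).
x0 = np.array([1.0, 0.0, 0.0])
x1 = np.array([0.0, 1.0, 0.0])

# Two-step, flow-map-style trajectory: big jumps along the geodesic instead of
# many small ODE integration steps. Each iterate stays on the sphere.
for t in (0.5, 1.0):
    xt = slerp(x0, x1, t)
    print(t, xt, np.linalg.norm(xt))  # norm stays 1: the point never leaves the manifold
```

The design point of the sketch: on a manifold you cannot average points in ambient space without falling off the surface, so few-step jumps must follow geodesics (or learned maps that respect the geometry).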
@nmboffi
Nicholas Boffi
3 days
@huijiezh Really nice work! I really like the \alpha-flow idea! Your approach seems quite similar to our recent framework for flow maps, which you may also find interesting (https://t.co/QBp1kELVhF) - see also the pinned post on my page, which includes very similar diagrams to yours here.
arxiv.org
Flow-based generative models achieve state-of-the-art sample quality, but require the expensive solution of a differential equation at inference time. Flow map models, commonly known as...
1
3
21
@bose_joey
Joey Bose
3 days
This is as close to a religious text as anything I would read. So comprehensive and well done!
@JCJesseLai
Chieh-Hsin (Jesse) Lai
3 days
Tired of going back to the original papers again and again? Our monograph is a systematic and fundamental recipe you can rely on! 📘 We're excited to release 《The Principles of Diffusion Models》 with @DrYangSong, @gimdong58085414, @mittu1204, and @StefanoErmon. It traces the core…
1
3
85
@JCJesseLai
Chieh-Hsin (Jesse) Lai
3 days
Tired of going back to the original papers again and again? Our monograph is a systematic and fundamental recipe you can rely on! 📘 We're excited to release 《The Principles of Diffusion Models》 with @DrYangSong, @gimdong58085414, @mittu1204, and @StefanoErmon. It traces the core…
41
401
2K
@genesistxai
Genesis Molecular AI
3 days
Excited to share Pearl from Genesis Molecular AI (yes, we've updated our name!): the first co-folding model to clearly surpass AlphaFold 3 on protein-ligand structure prediction. Unlike LLMs that train on vast public data, drug discovery AI faces fundamental data scarcity. Our…
1
17
38
@fedzbar
Federico Barbero
9 days
🚨🌶️ Did you realise you can get alignment 'training' data out of open-weights models? Oops. We show that models will regurgitate alignment data that is (semantically) memorised. This data can come from SFT and RL... and can be used to train your own models! 🧵
10
40
238
@erikjbekkers
Erik Bekkers
12 days
As promised after our great discussion, @chaitanyakjoshi! Your inspiring post led to our formal rejoinder: the Platonic Transformer. What if the "Equivariance vs. Scale" debate is a false premise? Our paper shows you can have both. 📄 Preprint: https://t.co/kd8MFiOmuG 1/9
@chaitjo
Chaitanya K. Joshi
5 months
After a long hiatus, I've started blogging again! My first post was a difficult one to write, because I don't want to keep repeating what's already in papers. I tried to give some nuanced and (hopefully) fresh takes on equivariance and geometry in molecular modelling.
1
28
93
@osclsd
Oscar Davis
15 days
When the IMM paper came out in March, I implemented it myself for a project, before the official source code was made available. I am releasing my version now: https://t.co/VXOBw91GXk It contains most (if not all) features and should be easy to (re-)use! Hope someone finds it helpful 🙂
github.com
Unofficial Inductive Moment Matching implementation in PyTorch with Lightning. Clean and simple. - olsdavis/imm
1
1
9
@tsirigoc
Christos Tsirigotis
24 days
Excited to present our work on dense retrieval at COLM 2025! Enter BiXSE: Improving Dense Retrieval via Probabilistic Graded Relevance Distillation! We show how to train with a simple, point-wise, binary cross-entropy loss on LLM-graded data and outperform InfoNCE!
2
8
28
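The BiXSE abstract above describes the mechanism precisely enough to sketch: instead of a contrastive InfoNCE loss over in-batch negatives, each (query, document) pair gets a point-wise binary cross-entropy loss against an LLM-graded relevance label in [0, 1]. The function below is a hypothetical illustration of that recipe, not the paper's code; the `scale` logit temperature is an assumed parameter.

```python
import numpy as np

def graded_bce_loss(q_emb, d_emb, graded_labels, scale=20.0):
    """Point-wise BCE on LLM-graded relevance labels in [0, 1], in the spirit of
    BiXSE as described above. `scale` is a hypothetical logit temperature."""
    # Cosine similarity: one score per (query, document) pair.
    sims = np.sum(q_emb * d_emb, axis=-1) / (
        np.linalg.norm(q_emb, axis=-1) * np.linalg.norm(d_emb, axis=-1))
    p = 1.0 / (1.0 + np.exp(-scale * sims))     # sigmoid over scaled similarity
    eps = 1e-12                                  # numerical guard for log
    bce = -(graded_labels * np.log(p + eps)
            + (1.0 - graded_labels) * np.log(1.0 - p + eps))
    return bce.mean()

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 32))                # 4 toy query embeddings
d = rng.normal(size=(4, 32))                # 4 paired document embeddings
labels = np.array([1.0, 0.7, 0.3, 0.0])     # graded relevance from an LLM judge
print(graded_bce_loss(q, d, labels))
```

Note the contrast with InfoNCE: no softmax over a batch of negatives, so each pair's gradient depends only on its own score and grade, and fractional grades like 0.7 are used directly rather than being thresholded to binary relevance.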
@bose_joey
Joey Bose
24 days
Really cool work that unifies many threads on one-step generative models. This is now my go-to model family.
@nmboffi
Nicholas Boffi
24 days
Consistency models, CTMs, shortcut models, align your flow, mean flow... What's the connection, and how should you learn them in practice? We show they're all different sides of the same coin connected by one central object: the flow map. https://t.co/QBp1kELVhF 🧵(1/n)
0
0
28
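The unifying object in the thread above, the flow map, sends a sample at time s directly to its position at time t along the probability-flow trajectory, which is what lets consistency-style models sample in one or two jumps. For a single 1D data point with a linear (rectified-flow style) interpolant, that map has a closed form, so its defining self-consistency property can be checked numerically. The data point `X1` is a toy assumption; a real model would learn this map with a network.

```python
import numpy as np

# Toy flow map for the probability-flow ODE of the linear interpolant
# x_t = (1 - t) * x0 + t * x1 with a single "dataset" point x1 = X1.
X1 = 2.0  # toy training point (assumption for illustration)

def flow_map(x_s, s, t):
    """Exact map X_{s->t}: slides a point along its straight-line path toward X1."""
    return x_s + (t - s) * (X1 - x_s) / (1.0 - s)

x0 = np.random.default_rng(0).normal()     # noise sample at time 0
one_step = flow_map(x0, 0.0, 1.0)          # one jump straight to the data point
two_step = flow_map(flow_map(x0, 0.0, 0.5), 0.5, 1.0)  # compose two half-jumps
print(one_step, two_step)                  # both equal X1: the map is self-consistent
```

The composition property shown here (s→0.5 then 0.5→1 equals s→1 directly) is the "central object" angle: consistency models, shortcut models, and mean flows can all be read as different parametrizations and training losses for this same map.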
@majdi_has
Majdi Hassan
28 days
(1/7) New paper!🚀 https://t.co/dq6yEzWyHg ✅Boltzmann distribution sampling for peptides up to 8 residues ✅4.3ms of training MD trajectories ✅Open-source codebase With @charliebtan, @leonklein26, Saifuddin Syed, @dom_beaini, @mmbronstein, @AlexanderTong7, @k_neklyudov Read…
8
51
211
@bose_joey
Joey Bose
1 month
🔉 New paper on training better Diffusion Language Models that plan at inference time! Great work led by @pengzhangzhi1 and Zack B.!!
@pengzhangzhi1
Fred Zhangzhi Peng
1 month
🚨 New paper! We introduce a planner-aware training tweak to diffusion language models. ⚡ One-line-of-code change to the loss 💡 Fixes training–inference mismatch 📈 Strong gains in protein, text, and code generation https://t.co/RWy9GaX8G2 (1/n)
1
1
21
@PetarV_93
Petar Veličković
1 month
The @EEMLcommunity is coming to Podgorica 🇲🇪 on 8 November! Mark your calendars 🚀 Beyond excited to share that we're organising the Montenegrin ML Workshop (MMLW'25), part of EEML Workshop Series, together with @aisocietyme ❤️ (Free) registration required -- please see below!
1
6
7
@Schmidt_Center
Eric and Wendy Schmidt Center
1 month
🎉 Congrats to @Schmidt_Center postdoctoral fellow @lazar_atan and colleagues on their paper's acceptance to @NeurIPSConf 2025! The paper introduces Curly Flow Matching (CurlyFM), a new way to model non-gradient-field dynamics, capturing complex, periodic behaviors missed by current methods.
@kpetrovvic
Katarina Petrovic
1 month
🚀Curly Flow Matching has been accepted to @NeurIPSConf 2025! Massive shout-out to my awesome collaborators @lazar_atan @viggomoro @KKapusniak1 @ismaililkanc @mmbronstein @bose_joey @AlexanderTong7 Stay tuned for the camera-ready version + code soon 📸 See you in San Diego! 😎
0
5
15
@lazar_atan
Lazar Atanackovic
1 month
Curly-FM accepted to #NeurIPS2025! 🌊 See everyone in San Diego! 🌊
@kpetrovvic
Katarina Petrovic
1 month
🚀Curly Flow Matching has been accepted to @NeurIPSConf 2025! Massive shout-out to my awesome collaborators @lazar_atan @viggomoro @KKapusniak1 @ismaililkanc @mmbronstein @bose_joey @AlexanderTong7 Stay tuned for the camera-ready version + code soon 📸 See you in San Diego! 😎
0
2
18
@bose_joey
Joey Bose
1 month
🌊Now coming to a #NeurIPS2025 near you.
@kpetrovvic
Katarina Petrovic
1 month
🚀Curly Flow Matching has been accepted to @NeurIPSConf 2025! Massive shout-out to my awesome collaborators @lazar_atan @viggomoro @KKapusniak1 @ismaililkanc @mmbronstein @bose_joey @AlexanderTong7 Stay tuned for the camera-ready version + code soon 📸 See you in San Diego! 😎
0
1
66
@fchollet
François Chollet
2 months
The most important skill for a researcher is not technical ability. It's taste. The ability to identify interesting and tractable problems, and recognize important ideas when they show up. This can't be taught directly. It's cultivated through curiosity and broad reading.
101
571
4K
@bose_joey
Joey Bose
1 month
🎉3 papers, including 1 spotlight, at #NeurIPS2025. Congrats to all my co-authors 👏 Sadly, 2 very good papers didn't make it this time (including 1 best paper at a workshop). We will fight for 🇧🇷 On a silly but personal note, my paper-acceptance streak ended at 21 😭.
7
5
162
@Qu3ntinB
Quentin Bertrand
1 month
I am thrilled to announce that our work on the generalization of flow matching has been accepted to NeurIPS as an oral!! See you in San Diego 😎
@mathusmassias
Mathurin Massias
5 months
New paper on the generalization of Flow Matching https://t.co/BJMHUnY6xJ 🤯 Why does flow matching generalize? Did you know that the flow matching target you're trying to learn **can only generate training points**? with @Qu3ntinB, Anne Gagneux & Rémi Emonet 👇👇👇
5
67
588
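The surprising claim in the tweet above, that the exact flow matching target can only generate training points, can be verified numerically in 1D: for a finite training set with linear paths and Gaussian noise, the marginal velocity field is available in closed form, and integrating its ODE drives every noise sample onto one of the training points. The two-point "training set" below is a toy assumption for illustration.

```python
import numpy as np

# Exact flow-matching velocity field for a finite 1D "training set" with linear
# paths x_t = (1 - t) * x0 + t * x1 and x0 ~ N(0, 1). Integrating the ODE shows
# that samples land (numerically) on training points, never between them.
data = np.array([-1.0, 2.0])  # toy training set (assumption)

def velocity(x, t):
    # Posterior weights over which training point the current x_t belongs to.
    logw = -0.5 * ((x - t * data) / (1.0 - t)) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Mixture of the conditional velocities (a_j - x) / (1 - t).
    return np.sum(w * (data - x) / (1.0 - t))

x = np.random.default_rng(0).normal()   # start from noise at t = 0
t, dt = 0.0, 1e-3
while t < 1.0 - 1e-3:                   # simple Euler integration, stop near t = 1
    x += dt * velocity(x, t)
    t += dt
print(x)                                # ends next to one of the two data points
```

This is exactly the memorization puzzle the paper starts from: the *target* field memorizes, so any generalization observed in practice must come from how the neural network approximates that target, not from the target itself.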
@_aliemami
Ali Emami
1 month
I’m recruiting fully-funded PhD students (Fall 2026) to join my new group @EmoryCS! 🎓 We’ll work on trustworthy, impactful AI at the intersection of NLP, AI Safety, Human-Centered AI & AI4Health. More details at 👉 https://t.co/q0Td4rDABR (Picture is with some incredible people…)
0
4
19