 
            
Joey Bose
@bose_joey
Followers: 4K · Following: 10K · Media: 85 · Statuses: 2K
Assistant Professor @imperialcollege and @Mila_Quebec Affiliate member. Into Geometry ⋃ Generative Models ⋃ AI4Science. Ex-@UniofOxford, @Mila_Quebec, @UofT.
London
Joined January 2018
            
            
           🎉Personal update: I'm thrilled to announce that I'm joining Imperial College London @imperialcollege as an Assistant Professor of Computing @ICComputing starting January 2026. My future lab and I will continue to work on building better Generative Models 🤖, the hardest 
          
                
97 · 34 · 608
              
Flow Maps are all the rage these days. In this new work we generalize them to Riemannian manifolds. SOTA results 🚀, few-step inference 🔥, and it generalizes many few-step generative models on manifolds in one slick framework 😎. Great work with even greater collaborators 
           Introducing Generalised Flow Maps 🎉 A stable, few-step generative model on Riemannian manifolds 🪩 📚 Read it at:  https://t.co/iCTHedwCxf  💾 Code:  https://t.co/MeukcthFN2 
              @msalbergo @nmboffi @mmbronstein @bose_joey
            
            
                
1 · 8 · 82
              
            
@huijiezh Really nice work! I really like the \alpha-flow idea! Your approach seems quite similar to our recent framework for flow maps, which you may also find interesting (https://t.co/QBp1kELVhF) - see also the pinned post on my page, which includes very similar diagrams to yours here.
          
          
            
arxiv.org: Flow-based generative models achieve state-of-the-art sample quality, but require the expensive solution of a differential equation at inference time. Flow map models, commonly known as...
            
                
1 · 3 · 21
              
This is as close to a religious text as anything I would read. So comprehensive and well done! 
Tired of going back to the original papers again and again? Our monograph: a systematic and fundamental recipe you can rely on! 📘 We’re excited to release 《The Principles of Diffusion Models》— with @DrYangSong, @gimdong58085414, @mittu1204, and @StefanoErmon. It traces the core 
            
                
1 · 3 · 85
              
Tired of going back to the original papers again and again? Our monograph: a systematic and fundamental recipe you can rely on! 📘 We’re excited to release 《The Principles of Diffusion Models》— with @DrYangSong, @gimdong58085414, @mittu1204, and @StefanoErmon. It traces the core 
          
                
41 · 401 · 2K
              
             Excited to share Pearl from Genesis Molecular AI (yes, we've updated our name!): the first co-folding model to clearly surpass AlphaFold 3 on protein-ligand structure prediction. Unlike LLMs that train on vast public data, drug discovery AI faces fundamental data scarcity. Our 
          
                
1 · 17 · 38
              
             🚨🌶️ Did you realise you can get alignment `training’ data out of open weights models? Oops We show that models will regurgitate alignment data that is (semantically) memorised. This data can come from SFT and RL... and can be used to train your own models! 🧵 
          
                
10 · 40 · 238
              
             As promised after our great discussion, @chaitanyakjoshi! Your inspiring post led to our formal rejoinder: the Platonic Transformer. What if the "Equivariance vs. Scale" debate is a false premise? Our paper shows you can have both. 📄 Preprint:  https://t.co/kd8MFiOmuG  1/9 
           After a long hiatus, I've started blogging again! My first post was a difficult one to write, because I don't want to keep repeating what's already in papers. I tried to give some nuanced and (hopefully) fresh takes on equivariance and geometry in molecular modelling. 
            
                
1 · 28 · 93
              
When the IMM paper came out in March, I implemented it myself for a project, before the official source code was made available. I am releasing my version now:  https://t.co/VXOBw91GXk  It contains most/all features, and should be easy to (re-)use! Hope someone finds it helpful 🙂 
          
            
github.com: Non-official Inductive Moment Matching implementation in PyTorch with Lightning. Clean and simple. - olsdavis/imm
            
                
1 · 1 · 9
              
             Excited to present our work on dense retrieval at COLM 2025! Enter BiXSE: Improving Dense Retrieval via Probabilistic Graded Relevance Distillation! We show how to train with a simple, point-wise, binary cross-entropy loss on LLM-graded data and outperform InfoNCE! 
          
                
2 · 8 · 28
              
Really cool work that unifies many threads on one-step generative models. This is now my go-to model family. 
           Consistency models, CTMs, shortcut models, align your flow, mean flow... What's the connection, and how should you learn them in practice? We show they're all different sides of the same coin connected by one central object: the flow map.  https://t.co/QBp1kELVhF  🧵(1/n) 
            
                
0 · 0 · 28
              
             (1/7) New paper!🚀  https://t.co/dq6yEzWyHg  ✅Boltzmann distribution sampling for peptides up to 8 residues ✅4.3ms of training MD trajectories ✅Open-source codebase With @charliebtan, @leonklein26, Saifuddin Syed, @dom_beaini
            @mmbronstein @AlexanderTong7 @k_neklyudov Read
          
          
                
8 · 51 · 211
              
             🔉 New paper on training better Diffusion Language Models that plan at inference time! Great work led by @pengzhangzhi1 and Zack B.!! 
           🚨 New paper! We introduce a planner-aware training tweak to diffusion language models. ⚡ One-line-of-code change to the loss 💡 Fixes training–inference mismatch 📈 Strong gains in protein, text, and code generation  https://t.co/RWy9GaX8G2  (1/n) 
            
                
1 · 1 · 21
              
             The @EEMLcommunity is coming to Podgorica 🇲🇪 on 8 November! Mark your calendars 🚀 Beyond excited to share that we're organising the Montenegrin ML Workshop (MMLW'25), part of EEML Workshop Series, together with @aisocietyme ❤️ (Free) registration required -- please see below! 
          
                
1 · 6 · 7
              
             🎉 Congrats to @Schmidt_Center postdoctoral fellow @lazar_atan and colleagues on their paper acceptance to @NeurIPSConf 2025! CurlyFM introduces Curly Flow Matching, a new way to model non-gradient field dynamics, capturing complex, periodic behaviors missed by current methods. 
🚀Curly Flow Matching has been accepted to @NeurIPSConf 2025! Massive shout out to my awesome collaborators @lazar_atan @viggomoro @KKapusniak1 @ismaililkanc @mmbronstein @bose_joey @AlexanderTong7 Stay tuned for the camera-ready version + code soon 📸 See you in San Diego! 😎 
            
                
0 · 5 · 15
              
             Curly-FM accepted to #NeurIPS2025! 🌊 See everyone in San Diego! 🌊 
            
                
0 · 2 · 18
              
             🌊Now coming to a #NeurIPS2025 near you. 
            
                
0 · 1 · 66
              
             The most important skill for a researcher is not technical ability. It's taste. The ability to identify interesting and tractable problems, and recognize important ideas when they show up. This can't be taught directly. It's cultivated through curiosity and broad reading. 
          
                
101 · 571 · 4K
              
🎉3 papers including 1 spotlight at #NeurIPS2025. Congrats to all my co-authors 👏 Sadly, 2 very good papers didn't make it this time (including 1 best paper at a workshop). We will fight for 🇧🇷 On a silly but personal note, my paper acceptance streak ended at 21 😭. 
          
                
7 · 5 · 162
              
             I am thrilled to announce that our work on the generalization of flow matching has been accepted to NeurIPS as an oral!! See you in San Diego 😎 
           New paper on the generalization of Flow Matching  https://t.co/BJMHUnY6xJ  🤯 Why does flow matching generalize? Did you know that the flow matching target you're trying to learn **can only generate training points**? with @Qu3ntinB, Anne Gagneux & Rémi Emonet 👇👇👇 
            
                
5 · 67 · 588
              
             I’m recruiting fully-funded PhD students (Fall 2026) to join my new group @EmoryCS! 🎓 We’ll work on trustworthy, impactful AI at the intersection of NLP, AI Safety, Human-Centered AI & AI4Health. More details at 👉  https://t.co/q0Td4rDABR  (Picture is with some incredible people 
          
                
0 · 4 · 19
              