 
            
Saro (@pas_saro)
AI4Science @MIT • Former Maths @Cambridge_Uni, @AIatMeta, @GRESEARCHjobs
London, England · Joined May 2020
481 Followers · 142 Following · 4 Media · 42 Statuses
           Proud to share our work on Boltz-2, the first AI model to approach FEP-level accuracy for binding affinity prediction, while being 1000x faster 🚀 
           Excited to unveil Boltz-2, our new model capable not only of predicting structures but also binding affinities! Boltz-2 is the first AI model to approach the performance of FEP simulations while being more than 1000x faster! All open-sourced under MIT license! A thread… 🤗🚀 
            
                
0 replies · 2 reposts · 21 likes
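For anyone who wants to try this themselves, here is a minimal sketch of what a Boltz-2 affinity prediction run might look like, assuming the YAML input schema and `boltz predict` CLI documented in the open-source repo; the protein sequence, ligand SMILES, and file names below are placeholders, and exact field names may differ between Boltz versions, so check the repository docs.

```python
# Minimal sketch of a Boltz-2 affinity run (assumes `pip install boltz`).
# The sequence and SMILES are placeholders; the YAML schema follows the
# Boltz repository docs at the time of writing and may change.
import subprocess
from pathlib import Path

input_yaml = """\
version: 1
sequences:
  - protein:
      id: A
      sequence: MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ   # placeholder sequence
  - ligand:
      id: B
      smiles: "CC(=O)Oc1ccccc1C(=O)O"              # placeholder ligand (aspirin)
properties:
  - affinity:
      binder: B        # ask Boltz-2 to predict affinity for chain B
"""

Path("example_input.yaml").write_text(input_yaml)

# --use_msa_server builds the MSA via the public server instead of a local one.
subprocess.run(
    ["boltz", "predict", "example_input.yaml", "--use_msa_server"],
    check=True,
)
```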
              
            
            @proteinbase Note: we committed to publishing all results on Proteinbase when we submitted them, before we saw the results!
          
          
                
1 reply · 3 reposts · 21 likes
              
Pioneering work from @hannesStaerk and team on protein binder design. Honored that our group at @IOCBBoston could make a small contribution. Congratulations to all! 
           Excited to release BoltzGen which brings SOTA folding performance to binder design! The best part of this project has been collaborating with many leading biologists who tested BoltzGen at an unprecedented scale, showing success on many novel targets and pushing its limits! 🧵.. 
            
                
1 reply · 9 reposts · 59 likes
              
Open source takes the crown again. Congrats @HannesStaerk and the incredibly cracked Boltz team! 
           Excited to release BoltzGen which brings SOTA folding performance to binder design! The best part of this project has been collaborating with many leading biologists who tested BoltzGen at an unprecedented scale, showing success on many novel targets and pushing its limits! 🧵.. 
            
                
0 replies · 4 reposts · 30 likes
              
Epic. Another big step toward universal binder design. Produced nanomolar binders for 6/9 targets with no known binders and low (<30%) sequence similarity to any structural complex in the PDB. The Boltz team is unreal. All open-source and in the hands of scientists around the…
           We go after targets that require generalization. E.g. we tested 15 nanobodies against each of 9 targets selected for their dissimilarity to any protein with an existing bound structure. For 6 of 9 targets we obtain nM binders. The same 67% success rate holds for miniproteins 🤗 
            
                
5 replies · 26 reposts · 231 likes
              
Thrilled to finally see BoltzGen, our new state-of-the-art all-atom binder design model, coming out fully open-source after a very extensive experimental validation with many top academic and industry labs! 🧬 The diversity of the experiments is unprecedented, spanning binder…
          
                
4 replies · 79 reposts · 354 likes
              
             Thrilled to see BoltzGen out — our state-of-the-art universal binder design model. We stress-tested it on 25+ targets and found unprecedented generalization. Particularly excited by the 67% hit rate for nanobody designs on the hardest targets we could find in the PDB! 
           Excited to release BoltzGen which brings SOTA folding performance to binder design! The best part of this project has been collaborating with many leading biologists who tested BoltzGen at an unprecedented scale, showing success on many novel targets and pushing its limits! 🧵.. 
            
                
1 reply · 3 reposts · 34 likes
              
             Excited to release BoltzGen which brings SOTA folding performance to binder design! The best part of this project has been collaborating with many leading biologists who tested BoltzGen at an unprecedented scale, showing success on many novel targets and pushing its limits! 🧵.. 
          
                
19 replies · 261 reposts · 947 likes
              
📢 Call for proposals: Boltz small-molecule design collaboration! 🧬 Can we help design your ideal molecule? Can you help us improve our open-source models? Please reach out or share with scientists you know! More details below! It has been great to see the level of excitement…
          
                
4 replies · 69 reposts · 261 likes
              
Wow, apparently Boltz-2 was used as an example by none other than John Jumper ☺️🤗 
4. John isn’t convinced that MD is the next frontier. MD is just another model, so you’re limiting yourself to its scope. Instead, train to solve a "harder problem" and then apply it to other domains. His example was how Boltz-2 (@GabriCorso) predicts affinity from structure. 
          
                
0 replies · 5 reposts · 63 likes
              
             For those already using Boltz-2 affinity prediction, we realized that there was a bit of confusion around the different outputs from the models and in what contexts each should be used. We've added more details in the docs. A summary below. 
          
                
1 reply · 13 reposts · 147 likes
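As a companion to that summary, here is a small sketch of how one might read the two affinity heads from a Boltz-2 output file. The key names (`affinity_pred_value`, `affinity_probability_binary`) and their intended uses follow the repository docs at the time of writing and may change; the file path is a placeholder for wherever your own run wrote its predictions.

```python
# Sketch: inspect the two affinity outputs Boltz-2 writes per prediction.
# Key names follow the Boltz repo docs at the time of writing; the path below
# is a placeholder -- point it at the affinity JSON produced by your own run.
import json
from pathlib import Path

affinity_file = Path("predictions/example/affinity_example.json")  # placeholder path
result = json.loads(affinity_file.read_text())

# Binary head: probability that the ligand is a binder at all.
# Per the docs, this is the score meant for hit discovery / library triage.
p_binder = result["affinity_probability_binary"]

# Regression head: continuous, log-scale affinity estimate.
# Per the docs, this is the value meant for ranking analogues during
# hit-to-lead optimization, not an absolute potency on its own.
affinity_value = result["affinity_pred_value"]

print(f"P(binder)      = {p_binder:.2f}")
print(f"affinity (log) = {affinity_value:.2f}")
```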
              
             The team at @NVIDIA has done such amazing work accelerating Boltz-2 through novel CUDA kernels and deploying Boltz-2 as NVIDIA’s NIM! The kernels are live on the Boltz repo, and you can run the model with 2x training & inference speedup and large memory savings!🧵#cuEquivariance
          
          
                
4 replies · 16 reposts · 102 likes
              
             A team led by Regina Barzilay, a computer science professor at @MIT, has launched Boltz-2, an algorithm that unites protein folding and prediction of small-molecule binding affinity in one package. 
          
            
cen.acs.org: Freely available Boltz-2 algorithm can predict small-molecule binding affinities
            
                
0 replies · 8 reposts · 17 likes
              
             Absolutely thrilled to announce the availability of cuEquivariance v0.5 and our contributions to Boltz-2! cuEquivariance v0.5 is a huge release -- now including accelerated triangle attention and multiplication kernels, fundamental to performance of next-gen geometry-aware NNs. 
          
                
2 replies · 17 reposts · 124 likes
              
Did you know the NVIDIA #cuEquivariance library can now accelerate Triangle Attention and Triangle Multiplication operations? Say goodbye to AI model bottlenecks — get up to 5x speedups in training and inference to build and train bigger models. We’re excited that the next-gen…
          
                
4 replies · 38 reposts · 172 likes
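For context on what these kernels speed up, below is a plain PyTorch sketch of the "outgoing" triangular multiplicative update used in AlphaFold/Boltz-style pair stacks. It is a reference implementation of the operation for illustration, not the cuEquivariance API; channel sizes and the toy input are illustrative choices.

```python
# Reference sketch of the "outgoing" triangular multiplicative update, the kind
# of pair-representation operation fused triangle kernels accelerate.
# Plain PyTorch for illustration only; not the cuEquivariance API.
import torch
import torch.nn as nn


class TriangleMultiplicationOutgoing(nn.Module):
    def __init__(self, c_z: int = 128, c_hidden: int = 128):
        super().__init__()
        self.norm_in = nn.LayerNorm(c_z)
        self.proj_a = nn.Linear(c_z, c_hidden)
        self.gate_a = nn.Linear(c_z, c_hidden)
        self.proj_b = nn.Linear(c_z, c_hidden)
        self.gate_b = nn.Linear(c_z, c_hidden)
        self.norm_out = nn.LayerNorm(c_hidden)
        self.proj_out = nn.Linear(c_hidden, c_z)
        self.gate_out = nn.Linear(c_z, c_z)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: pair representation of shape [N, N, c_z]
        z_norm = self.norm_in(z)
        a = torch.sigmoid(self.gate_a(z_norm)) * self.proj_a(z_norm)
        b = torch.sigmoid(self.gate_b(z_norm)) * self.proj_b(z_norm)
        # "Outgoing" edges: combine edges i->k and j->k over the shared node k.
        # This einsum over all (i, j, k) triangles is the O(N^3) bottleneck
        # that fused CUDA kernels target.
        o = torch.einsum("ikc,jkc->ijc", a, b)
        g = torch.sigmoid(self.gate_out(z_norm))
        return g * self.proj_out(self.norm_out(o))


if __name__ == "__main__":
    layer = TriangleMultiplicationOutgoing()
    z = torch.randn(64, 64, 128)   # toy 64-residue pair representation
    print(layer(z).shape)          # torch.Size([64, 64, 128])
```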
              
Hailing from quantum chemistry and physics-based modeling, this feels like a seminal moment. And one that I would not have predicted 10 years ago. Kudos! And kudos for being on the side of openness. 
           Excited to unveil Boltz-2, our new model capable not only of predicting structures but also binding affinities! Boltz-2 is the first AI model to approach the performance of FEP simulations while being more than 1000x faster! All open-sourced under MIT license! A thread… 🤗🚀 
            
                
1 reply · 2 reposts · 19 likes
              
Beautiful work! Weights on HF here: huggingface.co
             Excited to unveil Boltz-2, our new model capable not only of predicting structures but also binding affinities! Boltz-2 is the first AI model to approach the performance of FEP simulations while being more than 1000x faster! All open-sourced under MIT license! A thread… 🤗🚀 
            
                
4 replies · 5 reposts · 35 likes
              
             Scalable computational binding affinity prediction is a crucial and long-standing scientific challenge. Physics-based methods like FEP are accurate but slow and expensive. Docking is fast but noisy. Deep learning models haven’t matched the reliability of FEP—until now. 
          
                
1 reply · 4 reposts · 49 likes
              
             Excited to unveil Boltz-2, our new model capable not only of predicting structures but also binding affinities! Boltz-2 is the first AI model to approach the performance of FEP simulations while being more than 1000x faster! All open-sourced under MIT license! A thread… 🤗🚀 
          
                
49 replies · 415 reposts · 2K likes
              