
Daniel Severo
@_dsevero
Followers 2K · Following 7K · Media 186 · Statuses 4K
Research Engineer FAIR @AIatMeta / PhD @UofT @VectorInst / Former @GoogleAI student researcher and @UFSC 🇧🇷 undergrad
Montréal, Québec
Joined March 2011
Excited to announce our new work on compressing vector database indices! Highlights: 🗜️ 7x compression of FAISS indices with no impact on accuracy or search runtime 🗜️ 30% memory reduction on billion-scale datasets https://t.co/JU9PoLG4Mh
https://t.co/YgbQ2j1jzj
Replies 0 · Reposts 7 · Likes 46
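The "no impact on accuracy" claim means the compression is lossless on the index payload. As a toy illustration only: the quantization codes stored in an inverted-file index are low-entropy byte arrays, so a general-purpose entropy coder can shrink them while the decoded codes, and hence every search result, stay bit-identical. Here zlib stands in for whatever codec the paper actually uses; nothing below is FAISS API.

```python
import random
import zlib

random.seed(0)

# Toy stand-in for the payload of a quantized index: per-vector codes
# drawn from a skewed (low-entropy) distribution over 16 symbols.
codes = bytes(random.choices(range(16), weights=[50] + [1] * 15, k=100_000))

compressed = zlib.compress(codes, level=9)
restored = zlib.decompress(compressed)

# Lossless round trip: decoded codes are bit-identical, so any search
# running over them returns exactly the same neighbours as before.
assert restored == codes
print(f"{len(codes)} -> {len(compressed)} bytes "
      f"({len(codes) / len(compressed):.1f}x)")
```

The interesting part of the actual work is presumably doing this without hurting search runtime, i.e. decoding fast enough to be hidden behind the scan.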
Excited to share our work Set Block Decoding! A new paradigm combining next-token-prediction and masked (or discrete diffusion) models, allowing parallel decoding without any architectural changes and with exact KV cache. Arguably one of the simplest ways to accelerate LLMs!
Replies 3 · Reposts 25 · Likes 112
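A toy sketch of the block-parallel idea as described in the tweet: keep ordinary left-to-right decoding at the block level, but fill all positions inside each block in one parallel call to a masked predictor. Every name here is invented for illustration; the real method's proposal/acceptance logic is in the paper.

```python
MASK = "_"

def toy_filler(prefix, block):
    # Stand-in for a masked predictor: fills every masked slot in the
    # block in a single "parallel" call (here, echoing position indices).
    return [f"t{len(prefix) + i}" if tok == MASK else tok
            for i, tok in enumerate(block)]

def set_block_decode(n_tokens, block_size=4):
    out = []
    while len(out) < n_tokens:
        # One forward pass proposes a whole block instead of one token,
        # so the sequence grows up to block_size positions per step.
        block = [MASK] * min(block_size, n_tokens - len(out))
        out.extend(toy_filler(out, block))
    return out

print(set_block_decode(10))
```

Because completed blocks are never revisited, a standard KV cache over the finished prefix remains exact, which matches the "exact KV cache" claim.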
Me: "Where did you go to school?" Person: "Depends, undergrad or PhD?" ... are you only allowed to tell me one of them?
Replies 0 · Reposts 0 · Likes 4
The news is out! We're starting Blank Bio to build a computational toolkit assisted with RNA foundation models. If you want to see my flip between being eerily still and overly animated check out the video below! The core hypothesis is that RNA is the most customizable molecule
Blank Bio (@blankbio_) is building foundation models to power a computational toolkit for RNA therapeutics, starting with mRNA design and expanding to target ID, biomarker discovery, and more. https://t.co/7VRxSRgSKK Congrats on the launch, @hsu_jonny, @phil_fradkin & @ianshi3!
Replies 13 · Reposts 25 · Likes 178
Why are we hiding updated scores during NeurIPS rebuttals? What is this trying to fix lol
Replies 0 · Reposts 0 · Likes 32
The supervision signal in AI4Science is so crisp that we can solve very complicated problems almost without any data or RL! In this project, we train a model to solve the Schrödinger equation for different molecular conformations using Density Functional Theory (DFT) In the
(1/n)🚨You can train a model solving DFT for any geometry almost without training data!🚨 Introducing Self-Refining Training for Amortized Density Functional Theory — a variational framework for learning a DFT solver that predicts the ground-state solutions for different
Replies 0 · Reposts 21 · Likes 91
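The "crisp supervision signal" and "almost without training data" claims can be unpacked via the textbook variational principle that a DFT solver can be trained against: the ground-state energy is the minimum of an energy functional, so the energy of a predicted solution is itself a label-free loss. The notation below is generic, not the paper's:

```latex
E_0(\mathcal{G}) \;=\; \min_{\rho}\, E_{\mathcal{G}}[\rho]
\qquad\Longrightarrow\qquad
\min_{\theta}\; \mathbb{E}_{\mathcal{G} \sim p(\mathcal{G})}
\!\left[\, E_{\mathcal{G}}\!\left[\rho_{\theta}(\mathcal{G})\right] \,\right]
```

Since the true ground state minimizes $E_{\mathcal{G}}$ for each geometry $\mathcal{G}$, lowering the predicted density's energy is strictly an improvement, and no labeled ground-state solutions are needed to compute the gradient.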
New work: a scalable way to learn distributions over permutations/rankings. The method can trade off compute and expressivity by varying the number of NFEs (i.e., unmasking more than one token at a time), and subsumes well-known families of models (e.g., the Mallows model) https://t.co/mEj3hXpBoL
Replies 1 · Reposts 3 · Likes 13
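For readers who haven't met the Mallows family the tweet says is subsumed: it is a distribution over permutations peaked at a center ranking, with dispersion $\phi$, and the classic way to sample it is the repeated insertion method. The sketch below is that standard sampler, unrelated to the tweet's flow-based parameterization.

```python
import random

def sample_mallows(n, phi, rng=random):
    """Sample a permutation of range(n) from a Mallows model centered
    at the identity, via repeated insertion: item i goes into slot j
    of the current prefix with weight phi**(i - j)."""
    perm = []
    for i in range(n):
        weights = [phi ** (i - j) for j in range(i + 1)]
        j = rng.choices(range(i + 1), weights=weights)[0]
        perm.insert(j, i)  # earlier slots create more inversions
    return perm

random.seed(0)
print(sample_mallows(6, phi=0.0))  # phi -> 0 concentrates on the identity
print(sample_mallows(6, phi=1.0))  # phi = 1 is uniform over permutations
```

A one-parameter family like this is a natural sanity check for any learned model over rankings: the learned distribution should be able to match it exactly.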
We've open sourced Adjoint Sampling! It's part of a bundled release showcasing FAIR's research and open source commitment to AI for science. https://t.co/6oBTnael8p
https://t.co/rYmJ02KguC
github.com — code for "Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching" (facebookresearch/adjoint_sampling)
Announcing the newest releases from Meta FAIR. We’re releasing new groundbreaking models, benchmarks, and datasets that will transform the way researchers approach molecular property prediction, language processing, and neuroscience. 1️⃣ Open Molecules 2025 (OMol25): A dataset
Replies 1 · Reposts 23 · Likes 116
Hard to describe how incredible Montreal is between May and November. Total vibe.
Replies 0 · Reposts 0 · Likes 7
Against conventional wisdom, I will be giving a talk with particular focus on the "how" and the various intricacies of applying stochastic control for generative modeling. Mon 9:50am Hall 1 Apex #ICLR2025 Also check out the other talks at https://t.co/7e2rJUmfIV!
Replies 9 · Reposts 33 · Likes 261
📣I'll be at the poster session with our follow-up on Discrete Flow Matching. We derive a closed-form solution to the kinetic optimal problem for conditional velocity on discrete spaces. Into flow models? come chat! 💬 🗓Poster: Sat 10am (#191), 🎤Oral: Sat 3:30pm (6E) #ICLR2025
Discrete Flow Matching extends the Flow Matching recipe to discrete data. But so far the focus of the community has been on the simple masking corruption process. We now enable general corruption processes. Imagination is the limit! Oral by @shaulneta Sat 3:30pm.
Replies 1 · Reposts 8 · Likes 36
We are presenting 3 orals and 1 spotlight at #ICLR2025 on two primary topics: On generalizing the data-driven flow matching algorithm to jump processes, arbitrary discrete corruption processes, and beyond. And on highly scalable algorithms for reward-driven learning settings.
Replies 1 · Reposts 28 · Likes 231
.@shenghao_yang's Ph.D. thesis, "Perspectives of Graph Diffusion: Computation, Local Partitioning, Statistical Recovery, and Applications," is now available. Link: https://t.co/UqqCtIJN9C Relevant papers: 1) p-Norm Flow Diffusion for Local Graph Clustering: https://t.co/NJZ4KhOla6
.@shenghao_yang passed his PhD defence today. Shenghao is the second PhD student to graduate from our group. I am very happy for Shenghao and the work that he has done! I would also like to thank the members of the committee: Stephen Vavasis, Yaoliang Yu, Lap Chi Lau and Satish
Replies 0 · Reposts 3 · Likes 14
BREAKING: Amii Chief Scientific Advisor, Richard S. Sutton, has been awarded the A.M. Turing Award, the highest honour in computer science, alongside Andrew Barto! Read the official @TheOfficialACM announcement: https://t.co/JXDhdEsQv7
#TuringAward #AI #ReinforcementLearning
Replies 5 · Reposts 49 · Likes 234
Congratulations to the latest Turing award winners Barto and Sutton! At this point, all three Canadian national AI institutes have a pioneering Turing winner in leadership: Amii - Rich Sutton Vector - Geoff Hinton Mila - Yoshua Bengio Canada AI haters real quiet right now.
Replies 12 · Reposts 34 · Likes 411
Our MIT class “6.S184: Introduction to Flow Matching and Diffusion Models” is now available on YouTube! We teach state-of-the-art generative AI algorithms for images, videos, proteins, etc. together with the mathematical tools to understand them. https://t.co/wDJcM1YTxJ (1/4)
Replies 76 · Reposts 595 · Likes 4K
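For readers deciding whether to watch: the workhorse objective such a course builds up to is the conditional flow matching regression (the standard formulation, not anything specific to these lectures):

```latex
\mathcal{L}_{\mathrm{CFM}}(\theta)
= \mathbb{E}_{\,t,\; x_1 \sim q,\; x_t \sim p_t(\cdot \mid x_1)}
\left\| v_{\theta}(x_t, t) - u_t(x_t \mid x_1) \right\|^{2}
```

Here $u_t(\cdot \mid x_1)$ is the conditional velocity that generates the chosen probability path $p_t(\cdot \mid x_1)$; the key fact is that this tractable per-sample regression has the same gradient in $\theta$ as regressing onto the intractable marginal velocity field.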
Are you a Canadian citizen or PR? 3rd or 4th year in undergrad? Available June 2-13 for a fully funded summer school in statistics? Learn more and apply here: https://t.co/zBfdor6DEa Yours truly will be teaching you some online learning theory.
statistics.utoronto.ca — The Statistical Sciences Research Program (UTSSRP) invites Canada's top undergraduates in statistics, data sciences, and mathematics.
Replies 0 · Reposts 13 · Likes 37
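"Some online learning theory" very likely includes the Hedge / multiplicative-weights algorithm, a staple of the field. A minimal sketch (my own toy example, not course material):

```python
import math

def hedge(losses, eta=0.5):
    """Multiplicative weights over experts: after each round, every
    expert's weight is scaled by exp(-eta * its loss), then the
    weights are renormalized to a probability vector."""
    n = len(losses[0])
    w = [1.0 / n] * n
    for round_losses in losses:
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, round_losses)]
        total = sum(w)
        w = [wi / total for wi in w]
    return w

# Expert 0 is consistently better, so it accumulates most of the mass.
losses = [[0.0, 1.0, 1.0]] * 10
w = hedge(losses)
print([round(x, 3) for x in w])
```

The classical guarantee is that the learner's cumulative loss exceeds the best single expert's by only $O(\sqrt{T \log n})$, which is the kind of result the course would prove.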