Mathurin Massias Profile
Mathurin Massias

@mathusmassias

Followers: 2K · Following: 2K · Media: 80 · Statuses: 369

Researcher @INRIA_lyon, Ockham team. Teacher @Polytechnique and @ENSdeLyon Machine Learning, Python and Optimization

Paris, France
Joined July 2018
@mathusmassias
Mathurin Massias
4 months
New paper on the generalization of Flow Matching https://t.co/BJMHUnY6xJ ๐Ÿคฏ Why does flow matching generalize? Did you know that the flow matching target you're trying to learn **can only generate training points**? with @Qu3ntinB, Anne Gagneux & Rรฉmi Emonet ๐Ÿ‘‡๐Ÿ‘‡๐Ÿ‘‡
17
257
1K
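The claim above — that the exact flow matching target can only generate training points — can be checked numerically. Below is a minimal numpy sketch (not code from the paper; the linear/CondOT path x_t = (1−t)x0 + t·x1 with Gaussian base is an assumption) that computes the exact marginal velocity field u* for a tiny empirical dataset and Euler-integrates the ODE: the trajectory lands on one of the training points.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(5, 2)) * 3.0  # 5 "training points" in 2D

def u_star(x, t, data):
    # Exact marginal velocity field for the linear path x_t = (1-t)x0 + t*x1,
    # x0 ~ N(0, I): a softmax-weighted average of the conditional velocities
    # (x1 - x) / (1 - t), weighted by N(x; t*x1, (1-t)^2 I).
    sigma = 1.0 - t
    logw = -((x - t * data) ** 2).sum(axis=1) / (2 * sigma**2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    cond_v = (data - x) / sigma
    return (w[:, None] * cond_v).sum(axis=0)

# Euler-integrate dx/dt = u*(x, t) from t = 0 to t ~ 1
x = rng.normal(size=2)                 # sample from the base distribution
n_steps, t_end = 1000, 1.0 - 1e-3
dt = t_end / n_steps
t = 0.0
for _ in range(n_steps):
    x = x + dt * u_star(x, t, data)
    t += dt

# The flow driven by the exact target collapses onto a training point
dists = np.linalg.norm(data - x, axis=1)
```

As t → 1 the Gaussian weights concentrate on the nearest training point, so the exact target memorizes: generalization must come from *not* learning u* perfectly.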
@Qu3ntinB
Quentin Bertrand
28 days
I am thrilled to announce that our work on the generalization of flow matching has been accepted to NeurIPS as an oral!! See you in San Diego ๐Ÿ˜Ž
5
67
589
@mathusmassias
Mathurin Massias
1 month
One-day workshop on Diffusion Models and Flow Matching, October 24th at ENS Lyon. Registration and the call for contributions (short talks and posters) are open at
gdr-iasis.cnrs.fr
Registration for the meeting is closed, but it is still possible to attend remotely. A Zoom link will be posted on this page a few days before the meeting. Physical registration is...
1
7
22
@tonysilveti
Tony S.F.
4 months
This is happening today, 6pm Paris time!
@Cohere_Labs
Cohere Labs
4 months
Don't forget to join us tomorrow, July 3rd, as we host @tonysilveti for a session on "Training neural networks at any scale". Learn more:
0
3
20
@eugene_ndiaye
Eugene Ndiaye
4 months
Some ๐Ÿ“ธ๐Ÿคณfrom the ongoing #MlssSenegal2025 ๐Ÿ™Œ๐Ÿฟ
3
19
58
@mathusmassias
Mathurin Massias
4 months
Then why does flow matching generalize?? Because it fails! The inductive bias of the neural network prevents it from perfectly learning u* and overfitting. In particular, networks fail to learn the velocity field at two particular time values. See the paper for a finer analysis 😀
1
5
59
@mathusmassias
Mathurin Massias
4 months
We propose to regress directly against the optimal (deterministic) u* and show that it never degrades performance. On the contrary, removing target stochasticity helps the model generalize faster.
1
2
25
@mathusmassias
Mathurin Massias
4 months
Yet FM generates new samples! One hypothesis to explain this paradox is target stochasticity: FM targets the conditional velocity field, i.e. only a stochastic approximation of the full velocity field u*. *We refute this hypothesis*: very early in training, the approximation almost equals u*.
1
2
36
@umutsimsekli
Umut U. Simsekli
5 months
Same here. My solution: I gave a "very low" grade to 99% of the papers
@Ofirlin
Ofir Lindenbaum
5 months
This is quite strange, I received a batch of papers that are mostly unrelated to my expertise. @NeurIPSConf
0
3
9
@mathusmassias
Mathurin Massias
6 months
On Saturday Anne will also present some very, very cool work on how to leverage Flow Matching models to obtain state-of-the-art Plug-and-Play methods: PnP-Flow: Plug-and-Play Image Restoration with Flow Matching, poster #150 in poster session 6, Saturday at 3 pm https://t.co/0GRNZd3l8O
arxiv.org
In this paper, we introduce Plug-and-Play (PnP) Flow Matching, an algorithm for solving imaging inverse problems. PnP methods leverage the strength of pre-trained denoisers, often deep neural...
0
0
3
@mathusmassias
Mathurin Massias
6 months
It was received quite enthusiastically here, so time to share it again!!! Our #ICLR2025 blog post on Flow Matching was published yesterday: https://t.co/2V5BLl6T2p My PhD student Anne Gagneux will present it tomorrow at ICLR, 👉poster session 4, 3 pm, #549 in Hall 3/2B 👈
1
5
11
@eugene_ndiaye
Eugene Ndiaye
10 months
MLSS is coming to Senegal ๐Ÿ‡ธ๐Ÿ‡ณ in 2025! ๐ŸŒ ๐Ÿ“ AIMS Mbour, Senegal ๐Ÿ“… June 23 - July 4, 2025 An international summer school to explore, collaborate, and deepen your understanding of machine learning in a unique and welcoming environment. Details:
2
33
60
@Qu3ntinB
Quentin Bertrand
10 months
For people who could not attend the #NeurIPS2024 tutorial on flow-matching, here is a friendly introduction! https://t.co/IJIO2Bzo8X
@RickyTQChen
Ricky T. Q. Chen
10 months
Flow Matching tutorial slides: https://t.co/gZ8qg7yVvU
0
93
699
@mathusmassias
Mathurin Massias
11 months
The optimization loss in FM is easy to evaluate and does not require integration, unlike in CNFs. The whole process is smooth! The illustrations are much nicer in the blog post, go read it! 👉👉 https://t.co/TSkg1VZ5cn 👈👈
1
2
12
@mathusmassias
Mathurin Massias
11 months
FM learns a vector field u pushing the base distribution to the target through an ODE. To learn it, introduce a conditioning random variable, breaking the problem into smaller ones that have closed-form solutions. Here's the magic: the small problems can be used to solve the original one!
1
1
9
@mathusmassias
Mathurin Massias
11 months
FM is a technique to train continuous normalizing flows (CNFs) that progressively transform a simple base distribution into the target one. Two benefits: no need for likelihoods or ODE integration during training, and it makes the problem better posed by defining a *unique sequence of densities* from base to target.
1
1
14
@mathusmassias
Mathurin Massias
11 months
Anne Gagneux, Ségolène Martin, @qu3ntinb, Rémi Emonet and I wrote a tutorial blog post on flow matching: https://t.co/TSkg1VZ5cn with lots of illustrations and intuition! We got this idea after their cool work on improving Plug-and-Play with FM: https://t.co/0GRNZd3l8O
5
98
585
@mathusmassias
Mathurin Massias
11 months
Looks like the last reason to stay here won't remain valid for long!
0
0
8
@mathusmassias
Mathurin Massias
11 months
btw this is also my first post on ๐Ÿฆ‹ with the handle mathurinmassias ๐Ÿ˜€
0
0
2
@mathusmassias
Mathurin Massias
11 months
New blog post: the Hutchinson trace estimator, or how to evaluate the divergence/Jacobian trace cheaply. Fundamental for Continuous Normalizing Flows https://t.co/W94Q08QIRx
2
18
122
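The estimator in that post is short enough to sketch here. A minimal numpy version (an illustration under standard assumptions, not the blog post's code): tr(A) = E[vᵀAv] for random probe vectors v with i.i.d. Rademacher (±1) entries, using only matrix-vector products — which is exactly what makes it usable for the Jacobian trace in CNFs, where A·v is a vector-Jacobian product rather than an explicit matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

def hutchinson_trace(matvec, dim, n_probes, rng):
    # Estimate tr(A) from matrix-vector products only:
    # tr(A) = E[v^T A v] for v with i.i.d. Rademacher (+-1) entries.
    est = 0.0
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=dim)
        est += v @ matvec(v)
    return est / n_probes

A = rng.normal(size=(50, 50))
exact = np.trace(A)                                   # O(dim) diagonal read
approx = hutchinson_trace(lambda v: A @ v, 50, 10000, rng)
```

The variance shrinks like 1/n_probes; in CNF training a single probe per step is typically enough, since the noise averages out over stochastic gradient updates.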