Louis Serrano

@LouisSerrano31

65 Followers · 396 Following · 2 Media · 117 Statuses

PhD Student @ Sorbonne University - DL for Physics

Joined January 2012
@FrancoisRozet
François Rozet
1 month
Does a smaller latent space lead to worse generation in latent diffusion models? Not necessarily! We show that LDMs are extremely robust to a wide range of compression rates (10-1000x) in the context of physics emulation. We got lost in latent space. Join us 👇
14
91
465
@cosmo_shirley
Shirley Ho
2 months
The final days of summer are upon us, and it is bittersweet to say goodbye to our great group of @PolymathicAI interns! 😭 @JacopoTeneggi @cskokgibbs @astro_nolan @CristianaD2202 @LouisSerrano31 @rachelczhang Here are a few pics to remind us all the fun we had! (and hold your
0
6
29
@astro_nolan
Nolan Koblischke
3 months
Thanks everyone who came to my poster @icmlconf. I'm so happy to feel the excitement about using physics simulations to test and train science agents.
0
2
16
@LouisSerrano31
Louis Serrano
3 months
Very excited to present Zebra at ICML today! Zebra is an LLM trained with in-context examples on physics data to directly adapt to new spatiotemporal dynamics, bypassing gradient steps at inference. Come check out my poster at 11:00 AM, West Hall B2-B3 #W-106.
1
1
4
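The in-context adaptation Zebra describes can be illustrated with a minimal attention-style readout: given example transitions from an unseen dynamics, the next state of a query is predicted as a weighted combination of the context outputs, with no gradient steps at inference. This is a toy numpy sketch with illustrative names, not Zebra's actual architecture:

```python
import numpy as np

def in_context_next_state(context_in, context_out, query, temp=1.0):
    """Predict the next state for `query` by attending over in-context
    example transitions (no gradient updates at inference).

    context_in:  (n, d) states observed under the new dynamics
    context_out: (n, d) their next states
    query:       (d,)   current state to advance
    """
    # similarity of the query to each context state
    scores = context_in @ query / (temp * np.sqrt(query.size))
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # attention-weighted combination of the context next-states
    return weights @ context_out

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) * 0.3   # unseen linear dynamics
ctx_in = rng.standard_normal((32, 4))
ctx_out = ctx_in @ A.T                  # demonstration transitions
q = rng.standard_normal(4)
pred = in_context_next_state(ctx_in, ctx_out, q)
```

The point is only that the context examples, not gradient updates, carry the information about the new dynamics.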
@rdMorel
Rudy Morel
3 months
For evolving unknown PDEs, ML models are trained on next-state prediction. But do they actually learn the time dynamics: the "physics"? Check out our poster (W-107) at #ICML2025 this Wed, Jul 16. Our "DISCO" model learns the physics while staying SOTA on next-state prediction!
4
50
302
@jo_brandstetter
Johannes Brandstetter
4 months
We release AB-UPT, a novel method to scale neural surrogates to CFD meshes beyond 100 million mesh cells. AB-UPT is extensively tested on the largest publicly available datasets. 📄 https://t.co/xGQxhU8PuJ 🤗 https://t.co/WIirIMyNNd 💻 https://t.co/VuXjboZ0Xo
1
18
68
@PaulCouairon
Paul Couairon
4 months
Given an image, JAFAR builds high-res queries at the target resolution and low-res, semantically enriched keys using spatial feature modulation to power a cross-resolution attention mechanism that interpolates the low-resolution features from the foundation vision encoder (3/n)
1
1
7
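The cross-resolution attention described above, where high-resolution queries attend to low-resolution, semantically enriched keys to interpolate encoder features, can be sketched as a single attention layer. A hypothetical numpy sketch, not JAFAR's implementation:

```python
import numpy as np

def cross_resolution_upsample(queries_hr, keys_lr, values_lr):
    """Interpolate low-res features up to the query resolution.

    queries_hr: (Nq, d) high-res spatial queries at the target resolution
    keys_lr:    (Nk, d) low-res, semantically enriched keys
    values_lr:  (Nk, c) low-res foundation-encoder features
    """
    d = queries_hr.shape[1]
    scores = queries_hr @ keys_lr.T / np.sqrt(d)
    # row-wise softmax: each high-res query distributes attention
    # over the low-res locations
    scores -= scores.max(axis=1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ values_lr  # (Nq, c) upsampled features

rng = np.random.default_rng(0)
q_hr = rng.standard_normal((64, 16))  # e.g. an 8x8 target grid, flattened
k_lr = rng.standard_normal((16, 16))  # e.g. a 4x4 encoder grid, flattened
v_lr = rng.standard_normal((16, 3))
up = cross_resolution_upsample(q_hr, k_lr, v_lr)
```

Because the queries are built at an arbitrary target resolution, the same mechanism can upsample to any desired output size.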
@PaulCouairon
Paul Couairon
4 months
🚀Thrilled to introduce JAFAR—a lightweight, flexible, plug-and-play module that upsamples features from any Foundation Vision Encoder to any desired output resolution (1/n) Paper : https://t.co/le4pF8rVXH Project Page: https://t.co/rLW3jbin3O Github: https://t.co/1AL7ElibHf
7
75
483
@jo_brandstetter
Johannes Brandstetter
6 months
Today marks a big milestone for us at Emmi AI. We’ve raised a €15M seed round, backed by 3VC, Speedinvest, Serena, and PUSH. Let’s build the future of Physics AI together!
8
11
68
@Jeffaresalan
Alan Jeffares @ ICML 🇨🇦
7 months
me trying to cut my ICML rebuttal down to <5000 characters
1
1
26
@mlia_isir
MLIA
8 months
"Learning a Neural Solver for Parametric PDEs to Enhance Physics-Informed Methods" by Lise Le Boudec, @EBezenac, @LouisSerrano31, Ramon Daniel Regueiro-Espino, @yuanyinnn, Patrick Gallinari in collaboration with ETH Zürich, @CriteoAILab, @valeoai, INRIA ➡️ https://t.co/H5zRcd0enw
1
1
3
@IAmEricHedlin
Eric Hedlin
9 months
We present Hypernetwork Fields. We estimate the entire convergence trajectory for hypernetworks by introducing an extra variable representing the state of convergence. We show results for our model estimating DreamBooth parameters. 1/N🧵
8
64
353
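The idea of giving a hypernetwork an extra variable for the state of convergence can be sketched with a toy network: evaluating it over t ∈ [0, 1] traces an estimated convergence trajectory of target-network weights. All dimensions and names here are made up for illustration; this is not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)
D_COND, D_HID, N_WEIGHTS = 8, 32, 64  # hypothetical sizes
W1 = rng.standard_normal((D_COND + 1, D_HID)) * 0.1
W2 = rng.standard_normal((D_HID, N_WEIGHTS)) * 0.1

def hypernet(cond, t):
    """Map a conditioning vector plus a convergence-state scalar
    t in [0, 1] to a flattened set of target-network weights."""
    x = np.concatenate([cond, [t]])  # append the convergence state
    h = np.tanh(x @ W1)
    return h @ W2

cond = rng.standard_normal(D_COND)
# one forward pass per convergence state yields the whole trajectory
trajectory = np.stack([hypernet(cond, t) for t in np.linspace(0.0, 1.0, 5)])
```

A single network thus amortizes the entire optimization path rather than only its endpoint.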
@gcouairon
Guillaume Couairon
10 months
Proud to announce ArchesWeatherGen, an #opensource flow matching model for weather forecasting. ArchesWeatherGen surpasses IFS ENS (the reference operational model run by ECMWF) and NeuralGCM (hybrid physics-ML). Paper: https://t.co/f9tkI79Kia Code: https://t.co/9O8aXnBczT
1
8
53
@itsbautistam
Miguel Angel Bautista
10 months
Here's one to read on your flight to #NeurIPS2024! A flow-matching transformer model in function space! This model has all the advantages of neural fields: resolution-free generation and domain-agnostic architecture, while obtaining strong results on ImageNet-256 and Objaverse!
@YuyangW95
Yuyang Wang
10 months
1/n 🚨New preprint! Our work “Coordinate In and Value Out: Training Flow Transformers in Ambient Space” https://t.co/3VwBEKFJ9F presents a domain-agnostic and end-to-end flow-matching generative model that effectively handles various modalities like images and point clouds.
2
13
64
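A coordinate-in/value-out flow model of this kind maps (coordinate, current value, time) to a velocity and integrates it from noise toward data, which is why it works the same way for images, point clouds, or any other coordinate-indexed modality. A minimal Euler-integration sketch, with a hypothetical closed-form velocity field standing in for the learned transformer:

```python
import numpy as np

def velocity(coords, values, t):
    """Toy stand-in for the learned model: given coordinates, the
    current values along the flow, and time t, return a velocity."""
    return np.sin(coords * t) - values  # hypothetical field

def sample(coords, steps=10, seed=0):
    """Euler-integrate the flow from noise to data, per coordinate."""
    rng = np.random.default_rng(seed)
    values = rng.standard_normal(coords.shape)  # start from noise
    for k in range(steps):
        t = k / steps
        values = values + velocity(coords, values, t) / steps
    return values

coords = np.linspace(0.0, 1.0, 16)  # query coordinates (any domain)
out = sample(coords)
```

Swapping the coordinate set changes the modality; the sampler itself is domain-agnostic.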
@mlia_isir
MLIA
10 months
@MustafaShukor1 and @LouisSerrano31 presented their poster during day 2:
0
1
4
@sarah_perrin_
Sarah Perrin
11 months
♟️Mastering Board Games by External and Internal Planning with Language Models♟️ I'm happy to finally share https://t.co/1Hc2p883Wf TL;DR: In chess, our planning agents effectively reach grandmaster-level strength with a comparable search budget to that of human players!
@weballergy
Nenad Tomasev
11 months
I'm excited to share a new paper: "Mastering Board Games by External and Internal Planning with Language Models" https://t.co/jWoSojZtbQ (also soon to be up on arXiv, once it's been processed there)
1
12
31
@MustafaShukor1
Mustafa Shukor
11 months
We release AIMv2, a major step in scaling vision encoders. Properly scaling vision encoders has been challenging and lagging compared to LLMs. The main bottleneck is training and evaluating on a single image modality. (1/n)
2
31
166
@jo_brandstetter
Johannes Brandstetter
11 months
Super hyped to share NeuralDEM -- the first real-time simulation of industrial particulate flows. NeuralDEM replaces Discrete Element Method (DEM) routines and coupled (CFD-DEM) multiphysics simulations. 🧵 📜: https://t.co/JH4PDpth5g 🖥️: https://t.co/VEsawzd9IV
6
91
393
@mlia_isir
MLIA
11 months
"GEPS: Boosting Generalization in Parametric PDE Neural Solvers through Adaptive Conditioning" by @ArmandKassai, @JorgeMifsut, @yuanyinnn, Jean Noël Vittaut, Patrick Gallinari accepted as conference paper at #NeurIPS2024 ! ➡️ https://t.co/ztSKO05ogo 🖥️ https://t.co/Pz4YSrEYLw
1
4
6