
Jonas Spinner (@jonas_spinner)
PhD student in high energy physics
Heidelberg, Germany · Joined May 2024
76 Followers · 43 Following · 7 Media · 13 Statuses
Excited to share the Lorentz-equivariant Geometric Algebra Transformer (L-GATr) for high-energy physics. w/ @victorbreso, @pimdehaan, @jessethaler, Tilman Plehn, and @johannbrehmer.
RT @javivilladamigo: Can transformers learn the universal pattern of jet radiation and extrapolate beyond training data? Preprint: 'Extrap…
Thanks to the L-GATr team: @victorbreso, @pimdehaan, Tilman Plehn, Huilin Qu, @jessethaler, @johannbrehmer. Looking forward to exciting discussions at NeurIPS! 7/7
Thrilled to announce that L-GATr is going to NeurIPS 2024! Plus, there is a new preprint with extended experiments and a more detailed explanation. Code: Physics paper: CS paper: 1/7.
[arxiv.org] We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr...
Interested in using L-GATr (Lorentz-equivariance + geometric algebra representations + transformer) for your own high-energy physics application? Check out the L-GATr codebase at the link below; an illustrative sketch of the multivector representation follows after it.
[github.com] Repository for (J. Spinner et al 2024) - heidelberg-hepml/lorentz-gatr
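For intuition on the "geometric algebra representations" part, here is a minimal sketch of embedding four-momenta into multivectors of the 16-dimensional spacetime algebra Cl(1,3). The blade layout and helper names are illustrative assumptions, not the lorentz-gatr API:

```python
import torch

# The spacetime algebra Cl(1,3) has 16 basis blades:
# 1 scalar, 4 vectors, 6 bivectors, 4 trivectors, 1 pseudoscalar.
MV_DIM = 16
VECTOR_SLOTS = slice(1, 5)  # assumed layout: vector components e_0..e_3

def embed_four_momentum(p: torch.Tensor) -> torch.Tensor:
    """Embed four-momenta of shape (..., 4) into the vector slots of
    multivectors of shape (..., 16). The blade ordering here is an
    assumption; the real codebase fixes its own convention."""
    mv = torch.zeros(*p.shape[:-1], MV_DIM, dtype=p.dtype)
    mv[..., VECTOR_SLOTS] = p
    return mv

# Example: two particles with components (E, px, py, pz)
p = torch.tensor([[10.0, 1.0, 2.0, 3.0],
                  [ 5.0, 0.0, 1.0, 0.5]])
print(embed_four_momentum(p).shape)  # torch.Size([2, 16])
```

Roughly speaking, a Lorentz transformation acts on each grade of such a multivector in a known way, which is what allows the network's layers to be constrained to equivariant maps.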
We build the (to the best of our knowledge) first Lorentz-equivariant generative network. It uses Riemannian Flow Matching to hard-code phase space boundaries into the choice of trajectories; a minimal sketch of the objective follows below.
Excited to share our new work on Riemannian Flow Matching. Unlike diffusion-based approaches, it's:
- completely simulation-free on simple manifolds,
- trivially applies to higher dimensions,
- tractably generalizes to general geometries!
w/ @lipmanya
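For readers unfamiliar with flow matching, here is a minimal sketch of the training objective in its flat, Euclidean form with straight-line trajectories. The Riemannian version referenced above replaces these with trajectories that respect the geometry (here, phase-space boundaries); the network and data below are placeholders, not the paper's setup:

```python
import torch
import torch.nn as nn

# Toy velocity-field network v_theta(x, t); stands in for the generative model.
class VelocityNet(nn.Module):
    def __init__(self, dim: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, 64), nn.SiLU(), nn.Linear(64, dim)
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def flow_matching_loss(model, x1):
    """Conditional flow matching with straight-line trajectories:
    x_t = (1 - t) * x0 + t * x1, target velocity u = x1 - x0."""
    x0 = torch.randn_like(x1)          # noise sample
    t = torch.rand(x1.shape[0], 1)     # random time in [0, 1]
    xt = (1 - t) * x0 + t * x1         # point on the trajectory
    u = x1 - x0                        # conditional target velocity
    return ((model(xt, t) - u) ** 2).mean()

model = VelocityNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x1 = torch.randn(128, 4)  # placeholder "events"; real data would be phase-space points
opt.zero_grad()
loss = flow_matching_loss(model, x1)
loss.backward()
opt.step()
print(float(loss))
```

On a manifold, x_t would move along a geodesic instead of a straight line and the target u would be its tangent velocity; that choice of trajectory is where the phase-space boundaries get hard-coded.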
L-GATr extends GATr to the Lorentz group, the symmetry group of spacetime. Following GATr, it combines strong inductive biases (equivariance, geometric algebra representations) with scalability (transformer architecture); a numerical check of the equivariance idea follows below.
Are you dealing with geometric data, be it from molecules or robots? Would you like inductive biases *and* scalability? Our Geometric Algebra Transformer (GATr 🐊) may be for you. New work w/ @pimdehaan, Sönke Behrends, and @TacoCohen: 1/9
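To make the equivariance claim concrete, here is a small numerical check with a toy model that only sees Minkowski inner products, and is therefore Lorentz-invariant by construction. This illustrates the symmetry, not the L-GATr architecture itself:

```python
import torch

def boost_z(rapidity: float) -> torch.Tensor:
    """Lorentz boost along z with the given rapidity, metric signature (+,-,-,-)."""
    ch = torch.cosh(torch.tensor(rapidity))
    sh = torch.sinh(torch.tensor(rapidity))
    L = torch.eye(4)
    L[0, 0] = ch; L[0, 3] = sh
    L[3, 0] = sh; L[3, 3] = ch
    return L

def minkowski_gram(p: torch.Tensor) -> torch.Tensor:
    """Pairwise Minkowski inner products p_i . p_j for momenta of shape (n, 4)."""
    eta = torch.diag(torch.tensor([1.0, -1.0, -1.0, -1.0]))
    return p @ eta @ p.T

# Toy invariant "network": any function of the Gram matrix is Lorentz-invariant,
# because L^T eta L = eta for every Lorentz transformation L.
def toy_model(p: torch.Tensor) -> torch.Tensor:
    return torch.tanh(minkowski_gram(p)).sum()

p = torch.randn(5, 4)
L = boost_z(0.7)
print(toy_model(p))        # same value...
print(toy_model(p @ L.T))  # ...after boosting every four-momentum
```

The two printed values agree up to floating-point error. L-GATr gets the same guarantee not from hand-picked invariants but by constraining every layer to commute with the Lorentz action on multivectors, which keeps the architecture both expressive and scalable.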