Jonas Spinner Profile
Jonas Spinner

@jonas_spinner

Followers: 76 · Following: 43 · Media: 7 · Statuses: 13

PhD student in high energy physics

Heidelberg, Germany
Joined May 2024
Jonas Spinner @jonas_spinner · 1 year
Excited to share the Lorentz-equivariant Geometric Algebra Transformer (L-GATr) for high-energy physics, w/ @victorbreso @pimdehaan @jessethaler Tilman Plehn and @johannbrehmer.
Jonas Spinner @jonas_spinner · 8 months
RT @javivilladamigo: Can transformers learn the universal pattern of jet radiation and extrapolate beyond training data? Preprint: 'Extrap…
Jonas Spinner @jonas_spinner · 9 months
Thanks to the L-GATr team @victorbreso @pimdehaan Tilman Plehn Huilin Qu @jessethaler @johannbrehmer. Looking forward to exciting discussions at NeurIPS! 7/7
Jonas Spinner @jonas_spinner · 9 months
We train continuous normalizing flows with Riemannian flow matching and several choices for the vector field architecture, and compare them with our autoregressive density estimator 'JetGPT'. CNFs turn out to be more data-efficient, and turning them equivariant also helps. 6/7
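For readers who want to see the objective in code: below is a minimal flow-matching training step in PyTorch. This is a sketch, not the paper's implementation; the straight-line interpolant is the Euclidean special case of the Riemannian construction, and `v_theta` is an illustrative stand-in for any of the vector-field architectures compared above.

```python
import torch

def flow_matching_loss(v_theta, x1):
    """One conditional flow-matching step on a batch of target samples x1."""
    x0 = torch.randn_like(x1)        # samples from the Gaussian base density
    t = torch.rand(x1.shape[0], 1)   # random times in [0, 1]
    xt = (1 - t) * x0 + t * x1       # straight-line interpolant
    target = x1 - x0                 # its (constant) velocity
    return ((v_theta(xt, t) - target) ** 2).mean()  # regress the vector field
```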
Jonas Spinner @jonas_spinner · 9 months
For the first time, we have trained a Lorentz-equivariant architecture on a real-world tagging dataset (JetClass = 100M events). We find the hierarchy GNN < transformer < Lorentz-equivariant transformer, showing that equivariance also matters at scale. 5/7
Jonas Spinner @jonas_spinner · 9 months
We implement L-GATr attention by multiplying the queries with a list of signs inside the inner product, and then use off-the-shelf attention kernels. With this trick, L-GATr scales to many tokens like a standard transformer. 4/7
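A minimal sketch of this trick in PyTorch, assuming queries and keys whose feature axis carries components that pick up fixed signs (+1/-1) under the invariant inner product; the exact sign pattern and tensor layout are assumptions, not the paper's code.

```python
import torch
import torch.nn.functional as F

def invariant_attention(q, k, v, signs):
    # q, k, v: (batch, heads, tokens, dim); signs: (dim,) of +1/-1 entries
    q = q * signs  # absorb the metric signs into the queries ...
    # ... so that the ordinary dot product q.k equals the invariant inner
    # product, and an off-the-shelf fused kernel can be reused unchanged:
    return F.scaled_dot_product_attention(q, k, v)
```

Because the change happens before the kernel call, fused FlashAttention-style implementations apply directly, which is what gives the standard-transformer scaling.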
Jonas Spinner @jonas_spinner · 9 months
To build L-GATr, we replace each transformer module with a version that processes geometric algebra objects in a Lorentz-equivariant way. Plus, geometric algebra comes with a new operation, the geometric product, which allows for an extra layer. 3/7
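The geometric product is bilinear in its two inputs, so one way to sketch such a layer is an einsum against a fixed tensor of structure constants. The 16-component layout and the constants tensor `C` are assumptions here; constructing `C` for the spacetime algebra Cl(1,3) is omitted and is provided by geometric algebra libraries or the L-GATr codebase.

```python
import torch

class GeometricProduct(torch.nn.Module):
    """Bilinear layer computing the geometric product of two multivectors."""
    def __init__(self, C):
        super().__init__()
        # C: (16, 16, 16) structure constants, (x ∘ y)_i = sum_jk C_ijk x_j y_k
        self.register_buffer("C", C)

    def forward(self, x, y):
        # x, y: (..., 16) multivectors -> (..., 16) geometric product
        return torch.einsum("ijk,...j,...k->...i", self.C, x, y)
```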
Jonas Spinner @jonas_spinner · 9 months
The Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) represents particles at the LHC in the spacetime geometric algebra and processes them with a transformer architecture, combining the benefits of Lorentz and permutation equivariance. 2/7
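To make the representation concrete: a multivector of the spacetime algebra Cl(1,3) has 16 components (1 scalar, 4 vector, 6 bivector, 4 trivector, 1 pseudoscalar), and a particle's four-momentum naturally occupies the vector slots. The sketch below assumes a particular slot ordering; the actual embedding conventions live in the codebase.

```python
import torch

def embed_fourmomenta(p):
    # p: (batch, particles, 4) four-momenta -> (batch, particles, 16) multivectors
    mv = p.new_zeros(*p.shape[:-1], 16)
    mv[..., 1:5] = p  # components 1..4 assumed to be the vector slots e_0..e_3
    return mv
```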
Jonas Spinner @jonas_spinner · 9 months
Thrilled to announce that L-GATr is going to NeurIPS 2024! Plus, there is a new preprint with extended experiments and a more detailed explanation. Code: Physics paper: CS paper: 1/7.
arxiv.org
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr...
Jonas Spinner @jonas_spinner · 1 year
Interested in using L-GATr (Lorentz-equivariance + geometric algebra representations + transformer) for your own high-energy physics application? Check out the L-GATr codebase at
github.com
Repository for (J. Spinner et al 2024) - heidelberg-hepml/lorentz-gatr
Jonas Spinner @jonas_spinner · 1 year
L-GATr is as good as or better than SOTA on regression, classification, and generation tasks from particle physics. And there are many more problems in particle physics that L-GATr could help with.
Jonas Spinner @jonas_spinner · 1 year
We build the (to the best of our knowledge) first Lorentz-equivariant generative network. It uses Riemannian Flow Matching to hard-code phase space boundaries into the choice of trajectories.
Ricky T. Q. Chen @RickyTQChen · 3 years
Excited to share our new work on Riemannian Flow Matching. Unlike diffusion-based approaches, it's:
- completely simulation-free on simple manifolds,
- trivially applies to higher dimensions,
- tractably generalizes to general geometries!
w/ @lipmanya
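A minimal sketch of the Riemannian variant, assuming a hypothetical `manifold` object with `exp` and `log` maps: interpolating along geodesics keeps every training trajectory inside the manifold (here standing in for phase space with its boundaries), and the network regresses the geodesic's remaining velocity.

```python
import torch

def riemannian_fm_loss(v_theta, manifold, x0, x1):
    t = torch.rand(x1.shape[0], 1) * (1 - 1e-3)  # t in [0, 1), avoids the endpoint
    u = manifold.log(x0, x1)                     # tangent vector at x0 pointing to x1
    xt = manifold.exp(x0, t * u)                 # geodesic point at time t
    target = manifold.log(xt, x1) / (1 - t)      # remaining geodesic velocity
    return ((v_theta(xt, t) - target) ** 2).mean()
```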
Jonas Spinner @jonas_spinner · 1 year
L-GATr extends GATr to the Lorentz group, the symmetry group of spacetime. Following GATr, it combines strong inductive biases (equivariance, geometric algebra representations) with scalability (transformer architecture).
Johann Brehmer @johannbrehmer · 2 years
Are you dealing with geometric data, be it from molecules or robots? Would you like inductive biases *and* scalability? Our Geometric Algebra Transformer (GATr 🐊) may be for you. New work w/ @pimdehaan, Sönke Behrends, and @TacoCohen: 1/9