Fabian Fuchs

@FabianFuchsML

Followers: 3K
Following: 538
Media: 16
Statuses: 154

Research Scientist at DeepMind. Interested in invariant and equivariant neural nets and applications to the natural sciences. Views are my own.

Oxford, England
Joined June 2018
@FabianFuchsML
Fabian Fuchs
4 years
A year ago I asked: Is there more than Self-Attention and Deep Sets? - and got very insightful answers. 🙏 Now, Ed, Martin and I wrote up our own take on the various neural network architectures for sets. Have a look and tell us what you think! :) ➡️☕️
@FabianFuchsML
Fabian Fuchs
5 years
Both Max-Pooling (e.g. DeepSets) and Self-Attention are permutation invariant/equivariant neural network architectures for set-based problems. I am aware of a couple of variations for both of these. Are there additional, fundamentally different architectures for sets? 🤔
2
74
320
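The distinction in the tweet above can be sketched in a few lines of NumPy. This is a minimal toy illustration (fixed random weights, no learned parameters): max-pooling over per-element features, DeepSets-style, is permutation *invariant*, while self-attention is permutation *equivariant* (outputs are reordered exactly as the inputs are).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4
X = rng.normal(size=(n, d))           # a "set" of n feature vectors

def deepsets_pool(X):
    # phi: a per-element map (here a fixed linear map for illustration),
    # followed by max-pooling over the set dimension
    W = np.eye(d) + 0.1 * np.ones((d, d))
    return np.max(X @ W, axis=0)      # row order cannot matter

def self_attention(X):
    # single-head attention with identity Q/K/V projections, for clarity
    scores = X @ X.T / np.sqrt(d)
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)
    return A @ X                      # one output row per input row

P = rng.permutation(n)                # a random reordering of the set
# Invariance: the pooled summary is unchanged by reordering.
assert np.allclose(deepsets_pool(X), deepsets_pool(X[P]))
# Equivariance: attention outputs permute together with the inputs.
assert np.allclose(self_attention(X)[P], self_attention(X[P]))
```

Both checks pass for any permutation, which is exactly the symmetry these architectures build in by construction.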
@FabianFuchsML
Fabian Fuchs
1 year
RT @impliedprobs: 🧵1/7 Interested in who will become the next US President and how recent news events influenced the chances of each candid…
0
3
0
@FabianFuchsML
Fabian Fuchs
2 years
RT @adam_golinski: Our Apple ML Research team in Barcelona is looking for a PhD intern! 🎓 Curiosity-driven research 🧠 with the goal to publ…
0
48
0
@FabianFuchsML
Fabian Fuchs
2 years
Graphs, Sets, Universality. We put more work into this and are presenting it via the ICLR blog post track (thanks to the organisers and reviewers!). Have a read and let us know what you think. (Better in light mode 💡 - dark mode 🌙 messes with the LaTeX a bit.)
@PetarV_93
Petar Veličković
3 years
📢 New blog post! Realising an intricate connection between PNA (@GabriCorso @lukecavabarrett @dom_beaini @pl219_Cambridge) & the seminal work on set representations (Wagstaff @FabianFuchsML @martinengelcke @IngmarPosner @maosbot), Fabian and I join forces to attempt to explain!
0
8
43
@FabianFuchsML
Fabian Fuchs
2 years
RT @arkosiorek: Text-to-image diffusion models seem to have a good idea of geometry. Can we extract that geometry? Or maybe we can nudge th…
0
25
0
@FabianFuchsML
Fabian Fuchs
3 years
RT @PetarV_93: 📢 New blog post! Realising an intricate connection between PNA (@GabriCorso @lukecavabarrett @dom_beaini @pl219_Cambridge) &…
0
12
0
@FabianFuchsML
Fabian Fuchs
3 years
I have recently had a range of very insightful conversations with @PetarV_93 about graph neural networks, networks on sets, universality, and how ideas have spread in the two communities. This is our write-up - feedback welcome as always! :) ➡️☕️
2
44
201
@FabianFuchsML
Fabian Fuchs
3 years
RT @sokrypton: Anyone know of a department looking to hire faculty in the protein/genome+evolution+ML space? Also RNA biology (asking for a…
0
24
0
@FabianFuchsML
Fabian Fuchs
3 years
RT @arkosiorek: New blog post! Find out:
- what reconstructing masked images and our brains have in common,
- why reconstructing masked ima…
0
15
0
@FabianFuchsML
Fabian Fuchs
3 years
Graph neural networks often have to globally aggregate over all nodes, and how we do this can have a significant impact on performance 🎯. After recently finishing a project on this, I wrote a blog post on the topic. Let me know what you think! :) ➡️☕️
6
83
486
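The global aggregation (readout) step the tweet above refers to can be sketched as follows. This is an illustrative NumPy toy, not code from the blog post: after message passing, node embeddings `H` are collapsed into a single graph vector, and the choice of aggregator - sum, mean, max, or a PNA-style concatenation of several - changes what the representation can distinguish.

```python
import numpy as np

def readout(H, how="sum"):
    """Collapse node embeddings H (n x d) into one graph-level vector."""
    if how == "sum":
        return H.sum(axis=0)          # sensitive to graph size
    if how == "mean":
        return H.mean(axis=0)         # size-normalised, loses node counts
    if how == "max":
        return H.max(axis=0)          # picks out extreme features
    if how == "pna":                  # combine several aggregators
        return np.concatenate([H.sum(0), H.mean(0), H.max(0), H.min(0)])
    raise ValueError(how)

H_small = np.ones((2, 3))             # graph with 2 identical nodes
H_large = np.ones((6, 3))             # graph with 6 identical nodes
# A mean readout cannot tell these two graphs apart; a sum readout can.
assert np.allclose(readout(H_small, "mean"), readout(H_large, "mean"))
assert not np.allclose(readout(H_small, "sum"), readout(H_large, "sum"))
```

The failure case in the last two lines is one concrete way the choice of aggregator affects expressiveness, which is part of what motivates combining multiple aggregators.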
@FabianFuchsML
Fabian Fuchs
3 years
RT @emiel_hoogeboom: Molecule Generation in 3D with Equivariant Diffusion. Very happy to share this project (the…
0
66
0
@FabianFuchsML
Fabian Fuchs
4 years
I should have said 'no physics background knowledge required' - the blog post does assume general machine learning background knowledge :)
0
0
7
@FabianFuchsML
Fabian Fuchs
4 years
Emmy Noether connected symmetries and conserved quantities in physics - how is this related to exploiting symmetries with neural networks? 🤔 I've tried to answer this question in a blog post (no background knowledge required!) ➡️☕️
6
41
226
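One concrete instance of the symmetry idea above can be shown in a few lines. This is a hand-rolled NumPy sketch, not taken from the blog post: a model whose features depend only on pairwise distances is invariant under rotations of an input point cloud - the symmetry is built into the architecture rather than learned from data.

```python
import numpy as np

def pairwise_distance_features(points):
    """(n, 3) point cloud -> sorted multiset of pairwise distances."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    return np.sort(dists[np.triu_indices(len(points), k=1)])

rng = np.random.default_rng(1)
pts = rng.normal(size=(4, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])  # rotation about z

# Distances are unchanged by rotating the whole point cloud.
assert np.allclose(pairwise_distance_features(pts),
                   pairwise_distance_features(pts @ Rz.T))
```

Equivariant networks generalise this trick: instead of throwing away directional information, they process it in a way that transforms predictably under the symmetry group.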
@FabianFuchsML
Fabian Fuchs
4 years
Amazing video! Fantastic book, too - I am glad it is receiving all this attention. Many of those ideas and concepts are fundamental and helpful to understand, regardless of which sub-field of machine learning one is in.
@MLStreetTalk
Machine Learning Street Talk
4 years
Epic special edition MLST on geometric deep learning! Been in the works since May! with @mmbronstein @PetarV_93 @TacoCohen @joanbruna @ykilcher @ecsquendor
0
5
34
@FabianFuchsML
Fabian Fuchs
4 years
RT @IngmarPosner: Metacognition in AI? We'd love for you to join us at #NeurIPS2021 and tell us about your work. Closing soon. #ai #artif…
0
12
0
@FabianFuchsML
Fabian Fuchs
4 years
RT @adam_golinski: What startups are out there these days that need or make good use of the probabilistic ML toolkit? (thinking mostly: gen…
0
18
0
@FabianFuchsML
Fabian Fuchs
4 years
RT @sokrypton: Just to clarify. RoseTTAFold uses SE3 transformers (the core part of the structure module) and Nvidia just released a much f…
0
5
0
@FabianFuchsML
Fabian Fuchs
4 years
RT @sokrypton: RoseTTAFold about to become 21X faster? "NVIDIA just released an open-source optimized implementation that uses 9x less memo…
0
58
0
@FabianFuchsML
Fabian Fuchs
4 years
Thank you for this great write-up about AF2! "The SE(3)-Transformer […] is currently too computationally expensive for protein-scale tasks" ➡️ To be honest, I initially thought so as well; but @minkbaek et al. made it work! 😍
0
16
76
@FabianFuchsML
Fabian Fuchs
4 years
Particularly the proof in chapter 4 (entirely @EdWagstaff's work) blows my mind 🤯. He worked on this proof for an entire year and, imho, it was totally worth it. I love his talent for making abstract theory accessible to the reader. Blog post will follow 🙂 ☕️
2
14
86