
Fabian Fuchs
@FabianFuchsML
Followers: 3K · Following: 538 · Media: 16 · Statuses: 154
Research Scientist at DeepMind. Interested in invariant and equivariant neural nets and applications to the natural sciences. Views are my own.
Oxford, England
Joined June 2018
A year ago I asked: Is there more than Self-Attention and Deep Sets? I got very insightful answers. 🙏 Now Ed, Martin and I have written up our own take on the various neural network architectures for sets. Have a look and tell us what you think! :) ➡️☕️
Both Max-Pooling (e.g. DeepSets) and Self-Attention are permutation invariant/equivariant neural network architectures for set-based problems. I am aware of a couple of variations for both of these. Are there additional, fundamentally different architectures for sets? 🤔
2 replies · 74 retweets · 320 likes
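The invariance/equivariance distinction in the quoted question above is easy to check numerically. Below is a minimal sketch (plain numpy; all weights, dimensions, and function names are illustrative, not taken from any of the papers mentioned in this thread): a DeepSets-style model is permutation invariant because its pooling step is, while a self-attention layer is permutation equivariant, i.e. permuting the input rows permutes the output rows in the same way.

```python
# Minimal sketch: permutation invariance (DeepSets-style pooling) vs.
# permutation equivariance (self-attention). Toy shapes and random
# weights only; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, d, h = 5, 4, 8                     # set size, input dim, hidden dim
W_phi = rng.normal(size=(d, h))       # per-element encoder ("phi")
W_rho = rng.normal(size=(h, 1))       # post-pooling decoder ("rho")
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))

def deep_sets(X, pool=np.sum):
    """rho(pool_i phi(x_i)): invariant because the pooling op is.
    pool=np.max gives the max-pooling variant."""
    phi = np.tanh(X @ W_phi)          # encode each element independently
    return np.tanh(pool(phi, axis=0) @ W_rho)

def self_attention(X):
    """Softmax attention over the set: equivariant, not invariant."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(d)     # (n, n) pairwise interactions
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True) # row-wise softmax
    return A @ V                      # row i attends over all elements

X = rng.normal(size=(n, d))
perm = rng.permutation(n)

# Invariance: permuting the input set leaves the output unchanged.
assert np.allclose(deep_sets(X), deep_sets(X[perm]))
# Equivariance: permuting the input permutes the output the same way.
assert np.allclose(self_attention(X)[perm], self_attention(X[perm]))
```

Composing the equivariant attention layer with a final pooling step (e.g. a mean over rows) would make it invariant as well, which is how attention-based set models typically produce a single set-level output.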
RT @impliedprobs: 🧵1/7 Interested in who will become the next US President and how recent news events influenced the chances of each candid…
0 replies · 3 retweets · 0 likes
RT @adam_golinski: Our Apple ML Research team in Barcelona is looking for a PhD intern! 🎓 Curiosity-driven research 🧠 with the goal to publ…
0 replies · 48 retweets · 0 likes
Graphs, Sets, Universality. We put more work into this and are presenting it via the ICLR blogpost track (thanks to the organisers and reviewers!). Have a read and let us know what you think! Better in light mode 💡; dark mode 🌙 messes with the LaTeX a bit.
📢 New blog post! Realising an intricate connection between PNA (@GabriCorso @lukecavabarrett @dom_beaini @pl219_Cambridge) & the seminal work on set representations (Wagstaff @FabianFuchsML @martinengelcke @IngmarPosner @maosbot), Fabian and I join forces to attempt to explain!
0 replies · 8 retweets · 43 likes
RT @arkosiorek: Text-to-image diffusion models seem to have a good idea of geometry. Can we extract that geometry? Or maybe we can nudge th…
0 replies · 25 retweets · 0 likes
RT @PetarV_93: 📢 New blog post! Realising an intricate connection between PNA (@GabriCorso @lukecavabarrett @dom_beaini @pl219_Cambridge) &…
0 replies · 12 retweets · 0 likes
I have recently had a range of very insightful conversations with @PetarV_93 about graph neural networks, networks on sets, universality, and how ideas have spread in the two communities. This is our write-up; feedback welcome as always! :) ➡️☕️
2 replies · 44 retweets · 201 likes
RT @sokrypton: Anyone know of a department looking to hire faculty in the protein/genome+evolution+ML space? Also RNA biology (asking for a…
0 replies · 24 retweets · 0 likes
RT @arkosiorek: New blog post! Find out: - what reconstructing masked images and our brains have in common, - why reconstructing masked ima…
0 replies · 15 retweets · 0 likes
RT @emiel_hoogeboom: Molecule Generation in 3D with Equivariant Diffusion. Very happy to share this project (the…
0 replies · 66 retweets · 0 likes
Amazing video! Fantastic book, too; I am glad it is receiving all this attention. Many of those ideas & concepts are very fundamental and so helpful to understand, regardless of which specific sub-field of machine learning one is in.
Epic special edition of MLST on geometric deep learning! It has been in the works since May, with @mmbronstein @PetarV_93 @TacoCohen @joanbruna @ykilcher @ecsquendor
0 replies · 5 retweets · 34 likes
RT @IngmarPosner: Metacognition in AI? We'd love for you to join us at #NeurIPS2021 and tell us about your work. Closing soon. #ai #artif…
0 replies · 12 retweets · 0 likes
RT @adam_golinski: What startups are out there these days that need or make good use of the probabilistic ML toolkit? (thinking mostly: gen…
0 replies · 18 retweets · 0 likes
RT @sokrypton: Just to clarify. RoseTTAFold uses SE3 transformers (the core part of the structure module) and Nvidia just released a much f…
0 replies · 5 retweets · 0 likes
RT @sokrypton: RoseTTAFold about to become 21X faster? "NVIDIA just released an open-source optimized implementation that uses 9x less memo…
0 replies · 58 retweets · 0 likes
Thank you for this great write-up about AF2! "The SE(3)-Transformer […] is currently too computationally expensive for protein-scale tasks" ➡️ To be honest, I initially thought so as well, but @minkbaek et al. made it work! 😍
0 replies · 16 retweets · 76 likes
Particularly the proof in chapter 4 (entirely @EdWagstaff's work) blows my mind 🤯. He worked on this proof for an entire year and, imho, it was totally worth it. I love his talent for making abstract theory accessible to the reader. Blog post will follow 🙂 ☕️
2 replies · 14 retweets · 86 likes