Fabian Fuchs
@FabianFuchsML
Followers: 3K · Following: 537 · Media: 16 · Statuses: 154
Research Scientist at DeepMind. Interested in invariant and equivariant neural nets and applications to the natural sciences. Views are my own.
Oxford, England
Joined June 2018
A year ago I asked: Is there more than Self-Attention and Deep Sets? - and got very insightful answers. 🙏 Now Ed, Martin, and I have written up our own take on the various neural network architectures for sets. Have a look and tell us what you think! :) ➡️ https://t.co/Z1aprTcLQV ☕️
Both Max-Pooling (e.g. DeepSets) and Self-Attention are permutation invariant/equivariant neural network architectures for set-based problems. I am aware of a couple of variations for both of these. Are there additional, fundamentally different architectures for sets? 🤔
2 replies · 74 retweets · 319 likes
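For concreteness, here is a minimal NumPy sketch of the two families named above (toy weights; all function names are mine, not from any paper): a DeepSets-style network with max/sum pooling is permutation invariant, while a single self-attention layer is permutation equivariant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights for a DeepSets-style network: phi encodes each element,
# a pooling step aggregates, rho decodes the pooled summary.
W_phi = rng.normal(size=(4, 8))
W_rho = rng.normal(size=(8, 2))

def deep_sets(x, pool="max"):
    """Pool over set elements -> output is permutation INVARIANT."""
    h = np.tanh(x @ W_phi)                                 # phi, applied per element
    z = h.max(axis=0) if pool == "max" else h.sum(axis=0)  # order-independent pooling
    return z @ W_rho                                       # rho on the pooled vector

# Toy weights for a single self-attention head.
W_q, W_k, W_v = rng.normal(size=(3, 4, 4))

def self_attention(x):
    """Per-element outputs -> permutation EQUIVARIANT."""
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    a = np.exp(q @ k.T / np.sqrt(4.0))
    a /= a.sum(axis=1, keepdims=True)                      # softmax over set elements
    return a @ v

x = rng.normal(size=(5, 4))   # a set of 5 elements, 4 features each
p = rng.permutation(5)
assert np.allclose(deep_sets(x), deep_sets(x[p]))               # invariance
assert np.allclose(self_attention(x)[p], self_attention(x[p]))  # equivariance
```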
🧵1/7 Interested in who will become the next US President and how recent news events influenced the chances of each candidate? Let's dive into how election odds can offer insights. 👀
1 reply · 3 retweets · 5 likes
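For readers unfamiliar with the mechanics the thread refers to, a rough sketch of the standard arithmetic (the odds below are invented for illustration, not real market data): decimal betting odds imply probabilities via 1/odds, and normalising away the bookmaker's margin (the overround) yields comparable win probabilities.

```python
# Hypothetical decimal odds, for illustration only (not real market data).
odds = {"Candidate A": 1.8, "Candidate B": 2.4, "Candidate C": 15.0}

implied = {name: 1.0 / o for name, o in odds.items()}  # raw implied probabilities
overround = sum(implied.values())                      # > 1: the bookmaker's margin
probs = {name: p / overround for name, p in implied.items()}

for name, p in probs.items():
    print(f"{name}: {p:.1%}")
# Candidate A: 53.5%, Candidate B: 40.1%, Candidate C: 6.4% (sums to 100%)
```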
Our Apple ML Research team in Barcelona is looking for a PhD intern! 🎓 Curiosity-driven research 🧠 with the goal of publishing 📝 Topics: Confidence/uncertainty quantification and reliability of LLMs 🤖 Apply here:
4 replies · 48 retweets · 279 likes
Graphs, Sets, Universality. We put more work into this and are presenting it via the ICLR blog post track (thanks to the organisers and reviewers!). Have a read and let us know what you think: https://t.co/MtpPYYUfTm (better in light mode 💡; dark mode 🌙 messes with the LaTeX a bit)
📢 New blog post! Realising an intricate connection between PNA (@GabriCorso @lukecavabarrett @dom_beaini @pl219_Cambridge) & the seminal work on set representations (Wagstaff @FabianFuchsML @martinengelcke @IngmarPosner @maosbot), Fabian and I join forces to attempt to explain!
0 replies · 8 retweets · 43 likes
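As a very rough illustration of the shared idea (my own toy code, not the PNA implementation): a PNA-style layer applies several permutation-invariant aggregators to the same set of neighbour messages and concatenates the results, instead of committing to a single sum or max.

```python
import numpy as np

def pna_style_aggregate(messages: np.ndarray) -> np.ndarray:
    """Concatenate several permutation-invariant aggregators over neighbour
    messages of shape (n_neighbours, d). Toy version: no degree scalers,
    no learned towers."""
    return np.concatenate([
        messages.mean(axis=0),
        messages.max(axis=0),
        messages.min(axis=0),
        messages.std(axis=0),
    ])

msgs = np.random.default_rng(1).normal(size=(7, 16))
print(pna_style_aggregate(msgs).shape)  # (64,) = 4 aggregators x 16 dims
```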
Text-to-image diffusion models seem to have a good idea of geometry. Can we extract that geometry? Or maybe we can nudge these models to create large 3D consistent environments? Here's a blog summarizing some ideas in this space :) https://t.co/INlMuslCdU
0 replies · 25 retweets · 138 likes
I have recently had a range of very insightful conversations with @PetarV_93 about graph neural networks, networks on sets, universality and how ideas have spread in the two communities. This is our write up, feedback welcome as always! :) ➡️ https://t.co/vbmCGHBHxd ☕️
2 replies · 44 retweets · 201 likes
Anyone know of a department looking to hire faculty in the protein/genome+evolution+ML space? Also RNA biology (asking for a friend) 🙂🥼🧪
20 replies · 24 retweets · 116 likes
New blog post! Find out:
- what reconstructing masked images and our brains have in common,
- why reconstructing masked images is a good idea for learning representations,
- what makes a good mask and how to learn one
https://t.co/bFrmQvITEz
2 replies · 15 retweets · 96 likes
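As a toy illustration of the masking step the post discusses (my own sketch, not the post's method): split an image into non-overlapping patches and hide a random subset, which a model is then trained to reconstruct.

```python
import numpy as np

def random_patch_mask(img, patch=8, mask_ratio=0.75, seed=0):
    """Zero out a random subset of non-overlapping patches and return the
    masked image plus the boolean patch mask."""
    h, w = img.shape
    gh, gw = h // patch, w // patch
    n_patches = gh * gw
    rng = np.random.default_rng(seed)
    hidden = rng.choice(n_patches, size=int(mask_ratio * n_patches), replace=False)
    out = img.copy()
    mask = np.zeros(n_patches, dtype=bool)
    mask[hidden] = True
    for idx in hidden:
        r, c = divmod(idx, gw)
        out[r * patch:(r + 1) * patch, c * patch:(c + 1) * patch] = 0.0
    return out, mask.reshape(gh, gw)

img = np.random.default_rng(2).random((32, 32))   # toy grayscale image
masked, mask = random_patch_mask(img)
print(mask.mean())  # 0.75 -> three quarters of the patches are hidden
```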
Graph neural networks often have to globally aggregate over all nodes. How we do this can have a significant impact on performance 🎯. We recently finished a project on this, and I wrote a blog post about it. Let me know what you think! :) ➡️ https://t.co/OGJJAF9w9C ☕️
6 replies · 82 retweets · 483 likes
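A minimal sketch of the design choice in question (function names are mine): the global readout collapses all node embeddings into one graph-level vector, and the common choices behave differently, e.g. sum is sensitive to graph size while mean is not.

```python
import numpy as np

def readout(node_embs, how="sum"):
    """Global aggregation over node embeddings of shape (n_nodes, d)."""
    return {"sum": node_embs.sum(axis=0),
            "mean": node_embs.mean(axis=0),
            "max": node_embs.max(axis=0)}[how]

small, large = np.ones((3, 4)), np.ones((30, 4))
print(readout(small, "sum")[0], readout(large, "sum")[0])    # 3.0 vs 30.0: size-sensitive
print(readout(small, "mean")[0], readout(large, "mean")[0])  # 1.0 vs 1.0: size-invariant
```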
Molecule Generation in 3D with Equivariant Diffusion (https://t.co/4ZgiHdswER). Very happy to share this project (the last of my PhD, woohoo 🥳) and a super nice collab with @vgsatorras @ClementVignac (equal contribution shared among the three of us) and of course @wellingmax
6 replies · 66 retweets · 401 likes
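For intuition about the equivariance property at the heart of such models, here is a toy check (my own sketch, loosely in the spirit of an EGNN-style coordinate update, not the paper's code): moving points along relative positions, weighted by a rotation-invariant function of distances, commutes with rotations.

```python
import numpy as np

rng = np.random.default_rng(3)

def coord_update(x):
    """Toy equivariant update: move each point along relative positions,
    weighted by a rotation-invariant function of pairwise distances."""
    diff = x[:, None, :] - x[None, :, :]        # (n, n, 3) relative positions
    d2 = (diff ** 2).sum(-1, keepdims=True)     # squared distances (rotation-invariant)
    w = np.exp(-d2)                             # invariant edge weights
    return x + 0.1 * (w * diff).sum(axis=1)     # equivariant coordinate update

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))    # random orthogonal matrix

x = rng.normal(size=(5, 3))                     # 5 points in 3D
# Rotating first and then updating equals updating first and then rotating.
assert np.allclose(coord_update(x @ Q.T), coord_update(x) @ Q.T)
```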
I should have said 'no physics background knowledge required' - the blog post does assume general machine learning background knowledge :)
0 replies · 0 retweets · 7 likes
Emmy Noether connected symmetries and conserved quantities in physics - how is this related to exploiting symmetries with neural networks? 🤔 I've tried to answer this question in a blog post (no background knowledge required!): ➡️ https://t.co/VfxMu3fkAK ☕️
6 replies · 41 retweets · 225 likes
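For readers who want the one-line statement behind the post (standard textbook form, my notation): if the Lagrangian is invariant under a continuous transformation, the corresponding quantity is conserved.

```latex
% Noether's theorem, one-parameter symmetry, Lagrangian form:
\text{If } L(q, \dot q, t) \text{ is invariant under } q \mapsto q + \epsilon\,\delta q,
\quad \text{then} \quad
Q = \frac{\partial L}{\partial \dot q} \cdot \delta q
\quad \text{is conserved:} \quad \frac{\mathrm{d}Q}{\mathrm{d}t} = 0.
```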
Amazing video! Fantastic book, too; I am glad it is receiving all this attention. Many of those ideas and concepts are very fundamental and helpful to understand, regardless of which specific sub-field of machine learning one is in.
Epic special edition MLST on geometric deep learning! Been in the works since May! with @mmbronstein @PetarV_93 @TacoCohen @joanbruna @ykilcher @ecsquendor
https://t.co/DR4ZJmBMk2
0 replies · 5 retweets · 34 likes
Metacognition in AI? We'd love for you to join us at #NeurIPS2021 and tell us about your work. Closing soon... #ai #artificialintelligence
#ml #machinelearning #robotics #cogsci
sites.google.com
0 replies · 12 retweets · 10 likes
What startups are out there these days that need or make good use of the probabilistic ML toolkit? (thinking mostly: generative modeling, prob. inference, BO/active learning) Slowly starting to explore the job market and need your help! 🙏 RT
13 replies · 18 retweets · 77 likes
Just to clarify: RoseTTAFold uses the SE(3)-Transformer (the core part of the structure module), and NVIDIA just released a much faster version of this core part. The current code does not yet integrate these updates.
0 replies · 5 retweets · 15 likes
RoseTTAFold about to become 21X faster? "NVIDIA just released an open-source optimized implementation that uses 9x less memory and is up to 21x faster than the baseline official implementation." https://t.co/hY00CHvhLn
5 replies · 57 retweets · 237 likes
Thank you for this great write-up about AF2! https://t.co/ttQjaksxWh "The SE(3)-Transformer [...] is currently too computationally expensive for protein-scale tasks" ➡️ To be honest, I initially thought so as well; but @minkbaek et al made it work! 😍 https://t.co/wpQvOG16Sy
0 replies · 16 retweets · 76 likes
Particularly the proof in chapter 4 (entirely @EdWagstaff's work) blows my mind 🤯. He worked on this proof for an entire year and, imho, it was totally worth it. I love his talent for making abstract theory accessible to the reader. Blog post will follow 🙂 https://t.co/8AjiSjUZ2H ☕️
2 replies · 14 retweets · 86 likes