
GAMA Miguel Angel
@miangoar
Followers
2K
Following
52K
Media
547
Statuses
4K
Biologist who navigates the oceans of diversity through space-time | MSc in Biochem/Bioinfo @ibt_unam | Protein evo, metagenomics & AI/ML/DL
Mexico
Joined November 2020
Thrilled to announce our new preprint, "Protein Hunter: Exploiting Structure Hallucination within Diffusion for Protein Design," in collaboration with Griffin, @GBhardwaj8 and @sokrypton. Code and notebooks will be released by the end of this week. Golden - Kpop Demon Hunters
2
48
272
Protein Language Models are Accidental Taxonomists 1. A new study revealing a significant issue in protein-protein interaction (PPI) prediction models. These models, which use protein language models (pLMs), have been found to exploit phylogenetic distances rather than genuine
1
20
120
I just want to create hype and say that I made a 10-class course to introduce people to AI-driven protein design. It's around 750 slides and will be freely available for anyone who wants to use them and, most importantly, improve them. Stay tuned :)
4
14
118
A year ago David Baker won the Nobel Prize. Since then, we're a grateful universe.
0
0
18
This October I'm drawing 1 molecule a day inspired by proteins in the PDB @buildmodels #Inktober2025 Day 8 Prompt RECKLESS Pdb: 5TZO A reckless surge. Two molecules of fentanyl engulfed by their computationally designed binder, Fen49* Next: HEAVY Suggestions?
10
19
247
AlphaFoldDB updated, now includes the MSAs
We're renewing our collaboration with @GoogleDeepMind! We'll keep developing the AlphaFold Database to support protein science worldwide. To mark the moment, we've synchronised the database with UniProtKB release 2025_03 https://t.co/VTzPuEUC1r
#AlphaFold
4
45
342
bonus: It's kind of funny that attention is behind all breakthrough models, and no other operation has surpassed it so far. In fact, since 2017, two friends have had a bet that attention will still be the main paradigm by 2027. https://t.co/LtPICueC2F
0
0
3
9/9 ... by a multi-layer perceptron before the condensed info is passed to the next layer. What makes transformers so powerful is that they model all pairwise interactions between words simultaneously, allowing them to capture long-range relationships.
1
0
0
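A minimal sketch of the step described in 9/9, using toy numpy arrays: the attention output for each token is pushed through a small multi-layer perceptron (a non-linearity sandwiched between two linear maps) before moving on to the next layer. The shapes and weight names below are illustrative stand-ins, not a faithful implementation of any specific model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_tokens, d_model, d_hidden = 4, 8, 32

attn_out = rng.normal(size=(n_tokens, d_model))   # stand-in for the attention output

# Per-token MLP: two linear layers with a non-linearity in between.
W1 = rng.normal(size=(d_model, d_hidden))
W2 = rng.normal(size=(d_hidden, d_model))

hidden = np.maximum(0.0, attn_out @ W1)           # ReLU handles the non-linear relationships
next_layer_input = hidden @ W2                    # condensed info passed to the next layer
print(next_layer_input.shape)                     # (4, 8)
```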
8/9 In order to capture diverse relationships between words in a sentence, transformers use multiple attention heads in each layer. This allows them to capture more complex relationships as the architecture goes deeper. And finally the non-linear relationships are handled ...
1
0
0
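A sketch of the multi-head idea from 8/9, under the usual convention that the model dimension is split across heads: each head runs the same attention computation on its own slice of the vectors, and the per-head outputs are concatenated. The random arrays stand in for projected embeddings.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention for a single head."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(2)
n_tokens, n_heads, d_head = 4, 2, 4          # 2 heads, each working on a 4-dim slice

# Per-head q, k, v vectors (random stand-ins for projected embeddings).
Q = rng.normal(size=(n_heads, n_tokens, d_head))
K = rng.normal(size=(n_heads, n_tokens, d_head))
V = rng.normal(size=(n_heads, n_tokens, d_head))

# Each head captures its own kind of relationship; the outputs are then concatenated.
heads = [attention(Q[h], K[h], V[h]) for h in range(n_heads)]
multi_head_out = np.concatenate(heads, axis=-1)   # (n_tokens, n_heads * d_head)
print(multi_head_out.shape)                       # (4, 8)
```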
7/9 The dot product is an important operation because it first combines the q and k vectors to determine which relationships between words (i.e. tokens) are most important. Then, the resulting attention weights are multiplied by the v vectors to aggregate the relevant information.
1
0
0
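A minimal sketch of the two steps in 7/9, with small random matrices: the q·k dot products say which token pairs matter, a softmax turns them into attention weights, and those weights aggregate the v vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tokens, d = 4, 8                     # a 4-token "sentence", 8-dim vectors

Q = rng.normal(size=(n_tokens, d))     # one q, k and v vector per token
K = rng.normal(size=(n_tokens, d))
V = rng.normal(size=(n_tokens, d))

# Step 1: q.k dot products measure how strongly each token attends to every other token.
scores = Q @ K.T / np.sqrt(d)                                           # (n_tokens, n_tokens)
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)   # softmax per row

# Step 2: the attention weights aggregate the value vectors.
output = weights @ V                                                    # (n_tokens, d)
print(weights.round(2))
print(output.shape)
```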
6/9 q, k, and v are the main components for computing attention, which is how transformers assign importance to the relationships between words. Through attention, transformers learn the contextual meaning of words by computing their pairwise relationships via the dot product.
1
0
0
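A sketch of where the q, k and v vectors in 6/9 come from, assuming a toy embedding matrix: each token's embedding is multiplied by three learned weight matrices (the names Wq, Wk, Wv are mine) to give that token's query, key and value, and the q·k dot products then give the pairwise relationships.

```python
import numpy as np

rng = np.random.default_rng(1)
n_tokens, d_model, d_head = 4, 16, 8

X = rng.normal(size=(n_tokens, d_model))     # one embedding per token in the sentence

# Learned projection matrices (random here); training adjusts these.
Wq = rng.normal(size=(d_model, d_head))
Wk = rng.normal(size=(d_model, d_head))
Wv = rng.normal(size=(d_model, d_head))

Q, K, V = X @ Wq, X @ Wk, X @ Wv             # every token gets its own q, k and v vector

# The dot product of each q with every k gives the pairwise relationships between tokens.
pairwise = Q @ K.T                           # (n_tokens, n_tokens)
print(pairwise.shape)
```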
5/9 The attention mechanism used in transformers assumes that the meaning of words is determined by their contextual relationships with other words. Indeed, the meaning of a word can change a lot depending on its context.
1
0
0
4/9 When we want to retrieve specific info, we know which key to look for. However, transformers don't; they have to learn it. That's why they use a new element: the query (q). So, what do q, k, and v represent? q = what I'm looking for; k = what topic I carry; v = the info itself
1
0
1
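A small sketch of the point in 4/9, with made-up vectors: an ordinary lookup needs the exact key, whereas a transformer's query is just a vector that gets matched to whichever key it is most similar to.

```python
import numpy as np

# Exact lookup: you must already know the key.
info = {"drinks": "matcha latte", "foods": "tacos"}
print(info["drinks"])

# Learned lookup: the query is a vector, and the best key is found by similarity.
keys = {"drinks": np.array([1.0, 0.0]),
        "foods":  np.array([0.0, 1.0])}
query = np.array([0.8, 0.2])                           # q = "what I'm looking for"
best = max(keys, key=lambda name: keys[name] @ query)  # k = "what topic I carry"
print(best)  # -> "drinks"; its value would then supply the info itself (v)
```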
3/9 Keys (k) can be seen as the topic of the info contained, while the values (v) are the info itself, which can be a list, a string, a function, etc. The central idea of transformers is that neural nets can be seen as a kind of probabilistic key-value database.
1
0
1
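A sketch of the "probabilistic key-value database" idea from 3/9, with toy vectors: the query is compared against every key, and the output is a softmax-weighted blend of all the values rather than a single exact match.

```python
import numpy as np

# Toy "database": each row of K is a key vector, each row of V is the value stored under it.
K = np.array([[1.0, 0.0],   # key 0
              [0.0, 1.0],   # key 1
              [1.0, 1.0]])  # key 2
V = np.array([[10.0, 0.0],
              [0.0, 10.0],
              [5.0, 5.0]])

q = np.array([0.9, 0.1])  # query vector: closest in direction to key 0

scores = K @ q                                   # similarity of the query with every key
weights = np.exp(scores) / np.exp(scores).sum()  # softmax -> a probability over keys
output = weights @ V                             # blend of all values, weighted by relevance

print(weights)  # highest weight on key 0
print(output)   # mostly value 0, with small contributions from the others
```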
2/9 Dicts are inspired by a type of DB called a key-value store. Suppose you create a dict of the things you've learned your crush likes: drinks, foods, places, etc. Python lets you organize each field so that every key always corresponds to a value: {key : value}
1
0
1
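A minimal version of the dictionary from 2/9; the field names and values are hypothetical, just mirroring the example in the tweet.

```python
# A plain Python dict is a key-value store: every key maps to exactly one value.
crush_likes = {
    "drinks": "matcha latte",
    "foods": ["tacos", "ramen"],
    "places": "the beach",
}

# Retrieval requires knowing the exact key you want.
print(crush_likes["drinks"])      # -> matcha latte
print(crush_likes.get("movies"))  # -> None (no such key)
```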
1/9 If you're like me, i.e. someone with a background in the life sciences rather than computer science, let me share a summary of how I came to understand transformers, using an analogy with Python dictionaries.
1
12
90
This October I'm drawing 1 molecule a day inspired by proteins in the PDB @buildmodels #Inktober2025 Day 7 Prompt STARFISH Pdb: 6KG7 A starfish relies on sensory information. This is Piezo 2, a mechanically activated ion channel responsible for the sense of touch. Next: RECKLESS
2
12
165
Excited to announce mBER, our fully open AI tool for de novo design of epitope-specific antibodies. To validate, we ran the largest de novo antibody experiment to date: >1M designs tested against 145 targets, measuring >100M interactions. We found specific binders for nearly half
14
132
608
New in-depth blog post time: "Inside NVIDIA GPUs: Anatomy of high performance matmul kernels". If you want to deeply understand how one writes state-of-the-art matmul kernels in CUDA, read along. (Remember matmul is the single most important operation that transformers execute
48
393
3K