GAMA Miguel Angel 🐦‍⬛🔑

@miangoar

Followers: 2K · Following: 52K · Media: 547 · Statuses: 4K

Biologist who navigates the oceans of diversity through space-time | MSc in Biochem/Bioinfo @ibt_unam 🇲🇽 | Protein evo, metagenomics & AI/ML/DL

Mexico
Joined November 2020
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
1 day
Sometimes I think a lot about this phrase from @fchollet, and I always remember this scene from Wallace and Gromit 🤔: "Deep learning research is an evolution process driven by poorly-understood empirical results"
0
0
1
@ChoYehlin
Yehlin Cho
2 days
Thrilled to announce our new preprint, “Protein Hunter: Exploiting Structure Hallucination within Diffusion for Protein Design,” in collaboration with Griffin, @GBhardwaj8 and @sokrypton 🧬 Code and notebooks will be released by the end of this week. 🎧 Golden - Kpop Demon Hunters
2
48
272
@BiologyAIDaily
Biology+AI Daily
3 days
Protein Language Models are Accidental Taxonomists 1. A new study reveals a significant issue in protein-protein interaction (PPI) prediction models. These models, which use protein language models (pLMs), have been found to exploit phylogenetic distances rather than genuine
1
20
120
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
6 days
I just want to create hype and say that I made a 10-class course to introduce people to AI-driven protein design. It's around 750 slides and will be freely available for anyone who wants to use them and, most importantly, improve them. Stay tuned :)
4
14
118
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
6 days
A year ago, David Baker won the Nobel Prize. We've been a grateful universe ever since.
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
1 year
Every proteinologist today after seeing that David Baker won the #NobelPrize
0
0
18
@IrinaBezsonova
Irina Bezsonova
6 days
This October I'm drawing 1 molecule a day inspired by proteins in the PDB @buildmodels #Inktober2025 Day 8 Prompt: RECKLESS PDB: 5TZO A reckless surge. Two molecules of fentanyl engulfed by their computationally designed binder Fen49* Next: HEAVY Suggestions?
10
19
247
@sokrypton
Sergey Ovchinnikov
7 days
AlphaFoldDB updated, now includes the MSAs 🥳
@emblebi
EMBL-EBI
8 days
We're renewing our collaboration with @GoogleDeepMind! We'll keep developing the AlphaFold Database to support protein science worldwide 🎉 To mark the moment we've synchronised the database with UniProtKB release 2025_03 https://t.co/VTzPuEUC1r #AlphaFold
4
45
342
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
bonus: It's kind of funny that attention is behind all breakthrough models, and no other operation has surpassed it so far. In fact, since 2017, two friends have had a bet that attention will still be the main paradigm by 2027. https://t.co/LtPICueC2F
0
0
3
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
9/9 ... by a multi-layer perceptron before the condensed info is passed to the next layer. The power of transformers is that they model all pairwise interactions between words simultaneously, which lets them capture long-range relationships.
1
0
0
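To make the thread's punchline concrete, here is a minimal NumPy sketch of one transformer layer: single-head attention mixing info across all token pairs, followed by a per-token MLP. Toy sizes, random weights, and no layer norm; an illustration under those assumptions, not code from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model, d_ff = 4, 8, 16   # toy sizes: 4 tokens, 8-dim embeddings

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def block(x, p):
    # 1) Self-attention: every token looks at every other token (pairwise).
    q, k, v = x @ p["Wq"], x @ p["Wk"], x @ p["Wv"]
    x = x + softmax(q @ k.T / np.sqrt(d_model)) @ v   # residual connection
    # 2) MLP: condense each token's info before the next layer.
    x = x + np.maximum(0, x @ p["W1"]) @ p["W2"]      # residual connection
    return x

p = {n: rng.normal(size=s) * 0.1 for n, s in
     {"Wq": (d_model, d_model), "Wk": (d_model, d_model),
      "Wv": (d_model, d_model), "W1": (d_model, d_ff),
      "W2": (d_ff, d_model)}.items()}

x = rng.normal(size=(seq_len, d_model))
print(block(x, p).shape)   # (4, 8): same shape, ready for the next layer
```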
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
8/9 To capture diverse relationships between words in a sentence, transformers use multiple attention heads in each layer. This lets them capture more complex relationships as the architecture goes deeper. And finally, the non-linear relationships are handled ...
1
0
0
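A hedged sketch of the multi-head idea in the tweet above: one set of projections is split into several smaller heads, each attends separately, and the results are recombined. Sizes and weights are made up for illustration.

```python
import numpy as np

def multi_head_attention(x, W_q, W_k, W_v, W_o, n_heads):
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    # Split each projection into n_heads smaller heads.
    split = lambda t: t.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # per-head pairwise scores
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)          # softmax per head
    out = (weights @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ W_o                                     # recombine the heads

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                              # 4 tokens, 8 dims (toy)
W_q, W_k, W_v, W_o = (rng.normal(size=(8, 8)) * 0.1 for _ in range(4))
print(multi_head_attention(x, W_q, W_k, W_v, W_o, n_heads=2).shape)  # (4, 8)
```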
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
7/9 The dot product is an important operation because it first combines the q and k vectors to determine which relationships between words (i.e., tokens) are most important. Then, the resulting attention weights are multiplied by the v vectors to aggregate the relevant information.
1
0
0
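Steps 1 and 2 from the tweet above, as a minimal NumPy sketch (toy shapes, no masking or batching):

```python
import numpy as np

def attention(q, k, v):
    d_k = q.shape[-1]
    # Step 1: q.k — score every pairwise relationship between tokens.
    scores = q @ k.T / np.sqrt(d_k)               # (seq_len, seq_len)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)   # attention weights, rows sum to 1
    # Step 2: weights.v — aggregate the relevant info from the values.
    return weights @ v                            # (seq_len, d_v)

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(4, 8)) for _ in range(3))   # 4 tokens (toy sizes)
print(attention(q, k, v).shape)                          # (4, 8)
```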
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
6/9 q, k, and v are the main components for computing attention, which is how transformers assign importance to the relationships between words. Through attention, transformers learn the contextual meaning of words by computing their pairwise relationships via the dot product.
1
0
0
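One detail the tweet leaves implicit: q, k, and v are not extra inputs; each is a learned linear projection of the same token embeddings. A tiny sketch, with random matrices standing in for the learned weights:

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                      # 4 tokens, 8-dim embeddings (toy)
x = rng.normal(size=(seq_len, d_model))      # token embeddings

# q, k, v each come from projecting the same embeddings x with a
# different weight matrix; in a real model these weights are learned.
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
q, k, v = x @ W_q, x @ W_k, x @ W_v          # each: (seq_len, d_model)
```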
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
5/9 The attention mechanism used in transformers assumes that the meaning of words is determined by their contextual relationships with other words. Indeed, the meaning of a word can change a lot depending on its context.
1
0
0
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
4/9 When we want to retrieve specific info, we know which key to look for. However, transformers don't; they have to learn it. That's why they use a new element: the query (q). So, what do q, k, and v represent?
q = what I'm looking for
k = what topic I carry
v = the info itself
1
0
1
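Bridging the dict analogy to attention: a "soft" lookup, where the query doesn't need an exact key match; it scores every key by similarity and blends the values. All numbers here are arbitrary, purely for illustration:

```python
import numpy as np

# Keys and values as vectors instead of exact strings; numbers are arbitrary.
keys   = np.array([[1.0, 0.0],    # a "drinks"-like topic
                   [0.0, 1.0]])   # a "foods"-like topic
values = np.array([[10.0],        # the info stored under each key
                   [20.0]])

query = np.array([0.9, 0.1])      # what I'm looking for (close to "drinks")

# Soft lookup: score each key by similarity (dot product), softmax the
# scores into weights, and return a weighted mix of the values.
scores  = keys @ query
weights = np.exp(scores) / np.exp(scores).sum()
print(weights @ values)           # mostly the "drinks" value, a bit of "foods"
```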
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
3/9 Keys (k) can be seen as the topic of the info contained, while values (v) are the info itself, which can be a list, a string, a function, etc. The central idea of transformers is that neural nets can be seen as a kind of probabilistic key-value database.
1
0
1
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
2/9 Dicts are inspired by a type of DB called key-value stores. Suppose you create a dict of the things you've learned your crush likes: drinks, foods, places, etc. Python lets you organize each field so that every key always corresponds to a value. {key : value}
1
0
1
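For reference, the hard-lookup version the tweet describes; field names and values are made up to mirror the example:

```python
# A plain Python dict: a hard lookup where every key maps to one value.
# These entries are hypothetical, just mirroring the tweet's example.
crush_likes = {
    "drink": "matcha latte",
    "food":  "tacos al pastor",
    "place": "the beach",
}

print(crush_likes["drink"])   # exact key required -> "matcha latte"
```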
@miangoar
GAMA Miguel Angel 🐦‍⬛🔑
7 days
1/9 If you're like me, i.e. someone with a background in the life sciences rather than computer science, let me share a summary of how I understood how transformers work using an analogy with Python dictionaries.
1
12
90
@IrinaBezsonova
Irina Bezsonova
7 days
This October I'm drawing 1 molecule a day inspired by proteins in the PDB @buildmodels #Inktober2025 Day 7 Prompt: STARFISH PDB: 6KG7 A starfish relies on sensory information. This is Piezo2, the mechanically activated ion channel responsible for the sense of touch. Next: RECKLESS
2
12
165
@PierceOgdenJ
Pierce
15 days
Excited to announce mBER, our fully open AI tool for de novo design of epitope-specific antibodies. To validate it, we ran the largest de novo antibody experiment to date: >1M designs tested against 145 targets, measuring >100M interactions. We found specific binders for nearly half
14
132
608
@gordic_aleksa
Aleksa Gordić (水平问题)
16 days
New in-depth blog post time: "Inside NVIDIA GPUs: Anatomy of high performance matmul kernels". If you want to deeply understand how one writes state-of-the-art matmul kernels in CUDA, read along. (Remember, matmul is the single most important operation that transformers execute
48
393
3K