amelie_schreiber

@amelie_iska

Followers
1,020
Following
343
Media
27
Statuses
568

I ❤️ proteins! Researching protein language models, equivariant transformers, LoRA, QLoRA, DDPMs, flow matching, etc. intersex=awesome😎✡️🏳️‍🌈🏳️‍⚧️💻🧬❤️🇮🇱

California
Joined May 2023
@amelie_iska
amelie_schreiber
3 months
Top 10 ❤️ tools rn, in no particular order: 1. ProteinDT 2. MoleculeSTM 3. RFDiffusion-AA 4. RoseTTAFold-AA 5. LigandMPNN 6. Distributional Graphormer (DiG) 7. DNA-Diffusion 8. OAReactDiff 9. RFDiffusion (original) 10. EvoDiff ❤️ Evo ❤️ Flow matching ❤️ Boltzmann generators
2
27
196
@amelie_iska
amelie_schreiber
3 months
Protein binding a small molecule designed with RFDiffusion-AA yesterday. I'm such a huge fangirl for these all-atom models. Baker Lab is awesome!
Tweet media one
2
17
170
@amelie_iska
amelie_schreiber
4 months
These two together make a really good pair: From this you get conformational ensembles and binding affinities for protein-protein, protein-small molecule, and protein-nucleic acid complexes, reducing the need for expensive MD sims.
0
24
138
@amelie_iska
amelie_schreiber
3 months
Found out yesterday some of my @huggingface blogs inspired some undergrads to start studying AI applied to proteins and someone applied to and received an internship based on their interest in replicating and extending some of them. 😎 Feeling very inspired and grateful now. ❤️
4
8
130
@amelie_iska
amelie_schreiber
3 months
In case it is helpful:
1
13
94
@amelie_iska
amelie_schreiber
9 months
Just thought I would share this new Hugging Face community blog post I wrote as a follow-up to the ESMBind post. It explains how to build an ensemble of Low-Rank Adaptations (LoRAs) after you have finetuned multiple ESMBind LoRA models:
1
13
57
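The post above is about combining several finetuned LoRA adapters into an ensemble. A minimal sketch of one way to merge adapters, assuming simple weighted averaging of the low-rank updates (the weighting scheme and names here are my own illustration, not necessarily the blog's recipe):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 8, 2                # weight shape (d x k), LoRA rank r

W_base = rng.normal(size=(d, k))  # frozen base weight

# three independently finetuned LoRA adapters; B_i @ A_i is each update
adapters = [(rng.normal(size=(d, r)), rng.normal(size=(r, k)))
            for _ in range(3)]

# uniform ensemble weights; these could instead come from validation scores
weights = np.full(len(adapters), 1.0 / len(adapters))

# merged update: weighted average of the individual low-rank deltas
delta = sum(w * (B @ A) for w, (B, A) in zip(weights, adapters))
W_merged = W_base + delta
```

Because each delta is rank ≤ r, the merged delta has rank at most `r * len(adapters)`, so the merged model is still a cheap update on top of the frozen base.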
@amelie_iska
amelie_schreiber
3 months
An interesting and novel approach to applying transformers to graph structured data. This never got the attention it deserved and is likely an approach lost to time. It may be “old,” but it’s worth investigating further, especially for biochem/molecules:
3
8
55
@amelie_iska
amelie_schreiber
3 months
Damn, another E(3)-equivariant model that should have been SE(3)-equivariant. Molecules have chirality! Still exciting that it works for small molecules AND proteins:
0
7
55
@amelie_iska
amelie_schreiber
3 months
Has anyone else tried grafting two proteins together by first placing the proteins into AlphaFold-Multimer, then linking the proteins together with something like RFDiffusion motif scaffolding (treating the two proteins as though they are in the same chain)?
Tweet media one
3
5
51
@amelie_iska
amelie_schreiber
3 months
Equivariant Spatio-Temporal Attentive Graph Networks to Simulate Physical Dynamics: A Replacement for MD? TBD. More comments to come. OpenReview: GitHub:
4
9
46
@amelie_iska
amelie_schreiber
11 days
BindGPT sounds pretty cool! No code though 😒 probably because they’re on to something with this one, especially when considering the performance and the inference cost drop together. High throughput is really needed for this problem.
3
2
49
@amelie_iska
amelie_schreiber
8 months
Recently wrote a new blog post on intrinsic dimension of protein language model embeddings and curriculum learning:
1
7
41
@amelie_iska
amelie_schreiber
7 months
Working on a new method to cluster protein-protein complexes so I can finetune ESM-2 on them for predicting PPIs and for generating binders 😊. Also may try to finetune EvoDiff this way for generating binders. I ❤️ proteins so much.
2
1
40
@amelie_iska
amelie_schreiber
3 months
Here’s a new method for sampling the equilibrium Boltzmann distribution for proteins using GFlowNets: If you aren’t familiar with GFlowNets, head over to @edwardjhu ’s twitter and watch his video. I’ll also post a link to a related lecture soon.
3
4
40
@amelie_iska
amelie_schreiber
7 months
Just cooked up a new tokenization method for protein language models and large language models. I can't wait to share :)
1
1
40
@amelie_iska
amelie_schreiber
2 months
Not specifically for proteins or other molecules, but this is a nice intro to flow matching. Thanks for the video @ykilcher any chance you’d ever do something on this applied to proteins?
0
8
38
@amelie_iska
amelie_schreiber
1 month
Whenever an open-source version of #AlphaFold 3 gets built, be sure to try swapping out the diffusion module for a flow matching module. It’ll probably turn out better that way 😉
2
4
29
@amelie_iska
amelie_schreiber
20 days
ESM-AA huh? Would it be better to use random order autoregressive decoding (similar to LigandMPNN for example) instead of MLM? It seems like a harder objective to train on, but you could end up with a better performing model.
3
3
28
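The random-order autoregressive decoding mentioned above can be sketched in a few lines. This is a toy illustration with a stub in place of a real network (the stub and all names are my own, not ESM-AA's or LigandMPNN's actual code): positions are visited in a random permutation, and each prediction conditions on everything decoded so far.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def dummy_logits(seq, pos):
    # stand-in for a real network's output at `pos` given partial `seq`;
    # here just a flat, fixed preference over the 20 amino acids
    return {aa: 1.0 for aa in AMINO_ACIDS}

def random_order_decode(length, seed=0):
    """Fill positions one at a time in a random order, conditioning each
    prediction on the partially decoded sequence (LigandMPNN-style),
    instead of predicting all masked positions at once as in MLM."""
    rng = random.Random(seed)
    seq = [None] * length
    order = list(range(length))
    rng.shuffle(order)                    # random decoding order
    for pos in order:
        logits = dummy_logits(seq, pos)
        # greedy pick; a real model would sample from softmax(logits)
        seq[pos] = max(logits, key=logits.get)
    return "".join(seq)
```

The training objective would then average the loss over random decoding orders, which is indeed harder than a single MLM pass but exposes the model to every conditioning pattern.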
@amelie_iska
amelie_schreiber
2 months
Let’s go! “CRISPR-GPT leverages the reasoning ability of LLMs to facilitate the process of selecting CRISPR systems, designing guide RNAs, recommending cellular delivery methods, drafting protocols, and designing validation experiments to confirm editing outcomes.”
@biorxiv_bioinfo
bioRxiv Bioinfo
2 months
CRISPR-GPT: An LLM Agent for Automated Design of Gene-Editing Experiments #biorxiv_bioinfo
0
13
34
2
2
18
@amelie_iska
amelie_schreiber
6 months
Shouldn't we be able to do something similar to this with LoRA? LoRA and SVD are conceptually very similar. If so, that would likely explain the results in this paper where LoRA turns out to be better than full finetuning. Thoughts?
0
1
18
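The LoRA/SVD connection above can be made concrete. A truncated SVD gives the best rank-r approximation of a weight update, and LoRA parameterizes exactly that rank-r family with learned factors; this toy sketch (my own illustration) shows the correspondence:

```python
import numpy as np

rng = np.random.default_rng(1)
W_delta = rng.normal(size=(16, 16))   # pretend this is a full-finetuning update

# best rank-r approximation of the update via truncated SVD
r = 4
U, S, Vt = np.linalg.svd(W_delta)
delta_svd = U[:, :r] @ np.diag(S[:r]) @ Vt[:r, :]

# LoRA parameterizes the same rank-r family directly: delta = B @ A,
# with B (d x r) and A (r x k) learned by gradient descent
B = U[:, :r] @ np.diag(S[:r])
A = Vt[:r, :]
assert np.allclose(B @ A, delta_svd)  # one point in LoRA's search space

# the rank constraint discards the smallest singular directions, which is
# one (speculative) way LoRA could act as an implicit regularizer
err = np.linalg.norm(W_delta - delta_svd) / np.linalg.norm(W_delta)
```

Whether this truncation effect actually explains LoRA beating full finetuning is, as the tweet says, an open question.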
@amelie_iska
amelie_schreiber
4 months
Apparently you can in fact do flow matching on discrete data. For those interested in diffusion applied to discrete data like language and NLP, this is a good reference for how to do it with the more general flow matching models:
@json_yim
Jason Yim
4 months
Combining discrete and continuous data is an important capability for generative models. To address this for protein design, we introduce Multiflow, a generative model for structure and sequence generation. Preprint: Code: 1/8
2
91
445
0
1
17
@amelie_iska
amelie_schreiber
9 days
Okay, hear me out…stratify an NLP dataset (or any other modality really) by running a “homology search” with a BERT-style model, similar to this paper, but for non-protein data: Could help determine the amount of generalization, no?
2
0
15
@amelie_iska
amelie_schreiber
1 month
😎🧬
1
0
15
@amelie_iska
amelie_schreiber
3 months
Another E(3)-equivariant model that should be SE(3)-equivariant. E(3) doesn’t preserve chirality of molecules. GitHub:
@ML_Chem
Machine Learning in Chemistry
3 months
Transferable Water Potentials Using Equivariant Neural Networks #machinelearning #compchem
0
3
35
0
1
14
@amelie_iska
amelie_schreiber
2 months
C_4 symmetric motif scaffolding with RFDiffusion.
Tweet media one
0
0
14
@amelie_iska
amelie_schreiber
16 days
Love child from Distributional Graphormer ( #DiG ) and #alphafold3 when? C’mon @Microsoft and @GoogleDeepMind . If @OpenAI and @Apple can team up to deliver #Her we can also have a new model that does dynamics for complexes of biomolecules. You’re almost there 🔥🔥🔥you got this.
3
2
13
@amelie_iska
amelie_schreiber
2 months
This looks pretty amazing:
0
3
13
@amelie_iska
amelie_schreiber
7 months
So nervous about this one.
0
2
13
@amelie_iska
amelie_schreiber
23 days
Are Kolmogorov-Arnold Networks (KAN) enough to address some problems in biochemistry that suffer from data scarcity? Apparently they require much less data to converge, and all they’re really doing is making activation functions trainable using B-splines.
1
2
11
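The "trainable activation functions via splines" idea above can be sketched very compactly. This toy uses degree-1 B-splines (hat functions) on a fixed grid rather than the higher-order B-splines in the KAN paper, and all names are my own illustration:

```python
import numpy as np

class SplineActivation:
    """A single KAN-style trainable activation: a linear combination of
    degree-1 B-spline (hat) basis functions on a fixed grid. The spline
    coefficients are the trainable parameters."""
    def __init__(self, grid_min=-2.0, grid_max=2.0, n_knots=9, seed=0):
        self.knots = np.linspace(grid_min, grid_max, n_knots)
        self.coef = np.random.default_rng(seed).normal(size=n_knots)

    def basis(self, x):
        # hat function centered on each knot, width = knot spacing
        h = self.knots[1] - self.knots[0]
        return np.maximum(0.0,
                          1.0 - np.abs(x[:, None] - self.knots[None, :]) / h)

    def __call__(self, x):
        return self.basis(np.asarray(x, dtype=float)) @ self.coef
```

In a KAN, one such activation sits on every edge of the network, and training updates `coef` by gradient descent; the claim about data efficiency would then rest on how few coefficients per edge are needed.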
@amelie_iska
amelie_schreiber
27 days
Thank you for inviting me and for the wonderful conversation.
@labenz
Nathan Labenz
27 days
The AI Revolution in Biology is here - it's just not evenly distributed, even among biologists @amelie_iska previews biology as an experimental information science, on the latest Cognitive Revolution – out now! Listen to catch up! (link in thread)
2
4
22
2
2
11
@amelie_iska
amelie_schreiber
3 months
Interestingly, quantizing state space models like Mamba doesn't seem to work very well, whereas we are now in the era of 1-bit quantization for transformers ~without~ performance degradation; it also isn't clear if Mamba is as expressive as Transformers.
2
2
11
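The 1-bit transformer quantization referenced above typically works by something like absmean scaling. A minimal sketch of that idea (BitNet-style; a simplification of the actual published schemes, which also handle activations and ternary weights):

```python
import numpy as np

def quantize_1bit(W):
    """Absmean 1-bit quantization sketch: each weight is replaced by its
    sign, scaled by the mean absolute value of the whole tensor."""
    scale = np.abs(W).mean()
    return np.sign(W) * scale, scale

rng = np.random.default_rng(2)
W = rng.normal(size=(64, 64))
W_q, scale = quantize_1bit(W)
# W_q stores only signs plus one float, yet W_q @ x approximates W @ x
```

The open question in the tweet is why this style of extreme quantization degrades state space models like Mamba more than it degrades transformers.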
@amelie_iska
amelie_schreiber
7 months
If you have opportunities to work at the intersection of AI and proteins, DM me. I have ideas and I like implementing them :)
2
5
11
@amelie_iska
amelie_schreiber
10 days
RFDiffusion could’ve been used and it’s suggested this would improve outcomes. Why wasn’t it used for this problem? I’m curious. This would establish more use cases for RFDiffusion and similar methods (like flow matching with FoldFlow-2 for example).
@schlichthaerle
Thomas Schlichthaerle
11 days
Now online: We developed novel oligomers and turned them into FGFR agonists via binder induced receptor clustering. #denovo_proteins
Tweet media one
3
29
109
1
1
11
@amelie_iska
amelie_schreiber
3 months
(1/n) Even if Sora isn't currently capable of accurately generating simulations of small molecules or proteins, open sourcing it or giving select researchers access to it would allow us to add in equivariance or use components of it such as those that maintain temporal coherence.
4
0
10
@amelie_iska
amelie_schreiber
3 months
Okay, serious question. If you can accomplish the same thing with more general proteins, why restrict yourself to antibodies? Also, what are some problems that really truly require antibodies specifically and that can’t be done with more general proteins?
5
0
10
@amelie_iska
amelie_schreiber
3 months
Seems like an interesting method. Notably, it works better (SOTA?) if you give it conformational ensembles to work with. It would be very interesting to see how conformational sampling, Distributional Graphormer, or AlphaFlow might yield better results.
@JavierUtges
Javier Sánchez Utgés
3 months
Having a lot of fun visualising the ligand binding site predictions of #IFSitePred with #PyMol ! A new ligand binding site prediction method that uses #ESMIF1 learnt representations to predict where ligands bind! Check it out here: #Q96BI1
Tweet media one
0
2
20
2
1
10
@amelie_iska
amelie_schreiber
3 months
@TonyTheLion2500 I highly recommend this reference along with his “smooth manifolds” book: Introduction to Riemannian Manifolds (Graduate Texts in Mathematics)
0
0
10
@amelie_iska
amelie_schreiber
18 days
Key insight from recent events…patent the method, not the molecule. Some of these AI methods are going to wreck patents imo. 🤫 That said… MIT license >> patent (for humanity…usually).
2
0
8
@amelie_iska
amelie_schreiber
18 days
@SimonDBarnett Code is linked to in the Nature paper. This AI model actually samples the Boltzmann distribution, giving all the metastable states (low energy conformations) as well as the transition pathways between them. It’s a “generative diffusion model”:
0
1
8
@amelie_iska
amelie_schreiber
10 days
Goal: use partial diffusion and motif scaffolding to engineer a new version of nitrogenase then modify plant genetics to produce this new version so that chemical fertilizer is unnecessary.
0
0
8
@amelie_iska
amelie_schreiber
3 months
To all those just getting into this stuff: You’re entering one of the most interesting and impactful areas at the most exciting time. Don’t give up, even when it feels impossible. Stay close to the open source biochem AI community. They’re a great crowd. Good luck and have fun!
1
0
7
@amelie_iska
amelie_schreiber
3 months
Having solid temporal coherence, or modifying the architecture to be SE(3)-equivariant would allow us to create better versions of things like this: and we might actually be able to replace MD with AI, speeding up drug discovery and solving major problems
0
0
6
@amelie_iska
amelie_schreiber
3 months
Crowdsourcing suggestion…if you could selectively disrupt or augment a pathway or PPI network, where would you start? Assume you can block any PPI, or augment the PPI network by designing proteins that create intermediary interactions (ex: proteins that bind/link two others)
3
0
6
@amelie_iska
amelie_schreiber
3 months
@alexrives I have a method for detecting AI generated proteins that I would like to open source at some point if people are interested. It seems to work on proteins generated by most models out right now, although there are a couple of models it does not work for; I'm hesitant to say which ones.
2
0
7
@amelie_iska
amelie_schreiber
3 months
Selectively modulating PPI networks by designing high affinity and high specificity binders with RFDiffusion and checking that with AF-Multimer LIS score seems like low hanging fruit to me. What reasons might there be for this not being very actively & heavily worked on?
2
0
6
@amelie_iska
amelie_schreiber
17 days
@RolandDunbrack Try Distributional Graphormer and NeuralPLexer2
2
1
6
@amelie_iska
amelie_schreiber
3 months
Computational efficiency in equivariant models is often a concern. This model addresses that and creates fast SE(n)-equivariant models for tasks involving molecules:
0
1
6
@amelie_iska
amelie_schreiber
27 days
For an alternative top down approach I highly recommend the research of @drmichaellevin
1
0
6
@amelie_iska
amelie_schreiber
7 months
Eeep! It's wooorkiiing! So excited! 😊 I'll write a hf blog post on it once it's all done.
0
1
6
@amelie_iska
amelie_schreiber
3 months
@samswoora You should also check out flow matching models. Flow matching generalizes diffusion (diffusion is a special case of flow matching). They're doing a lot with proteins and flow matching, but there's less buzz about it in vision and language domains.
2
0
5
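The "diffusion is a special case of flow matching" point above is easiest to see from the training objective. A minimal conditional flow matching sketch with the straight-line (OT) path (my own toy illustration; diffusion corresponds to a different choice of path and noise schedule):

```python
import numpy as np

def cfm_training_pair(x1, rng):
    """One conditional flow-matching training example on the straight-line
    path from noise x0 to data x1. A network v_theta(xt, t) would be
    trained to regress the target velocity."""
    x0 = rng.normal(size=x1.shape)      # sample from the noise prior
    t = rng.uniform()                   # random time in [0, 1]
    xt = (1.0 - t) * x0 + t * x1        # point on the straight path
    v_target = x1 - x0                  # constant velocity along that path
    return t, xt, v_target

rng = np.random.default_rng(3)
x1 = rng.normal(size=(8,))              # stand-in for a "data" sample
t, xt, v = cfm_training_pair(x1, rng)
# training minimizes || v_theta(xt, t) - v ||^2 over many such pairs;
# sampling then integrates dx/dt = v_theta(x, t) from t=0 to t=1
```

Swapping the straight path for a variance-preserving Gaussian path recovers (score-based) diffusion training, which is the sense in which flow matching generalizes it.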
@amelie_iska
amelie_schreiber
14 days
Definitely the vibe I’m going for 😂
@EugeneVinitsky
Eugene Vinitsky
16 days
summer student project presentations are incredible
Tweet media one
21
550
11K
0
0
5
@amelie_iska
amelie_schreiber
3 months
@maurice_weiler @erikverlinde @wellingmax Could someone recommend a similar resource for other architectures like equivariant transformers or equivariance in geometric GNN models? Just curious what the go to resources are for people for other architectures.
2
0
4
@amelie_iska
amelie_schreiber
7 months
Now, using persistence landscapes we can cut down clustering time from a full day to less than 30 minutes for 1000 proteins!
0
2
5
@amelie_iska
amelie_schreiber
6 months
@pratyusha_PS This is awesome. When will the code be available? I would love to try this with a protein language model like ESM-2 and see if it improves performance.
2
0
5
@amelie_iska
amelie_schreiber
3 months
Attempting to raise my signal to noise ratio today by making some quality posts about AI and biochemistry. 😊
0
0
4
@amelie_iska
amelie_schreiber
24 days
🤔 Based on this lecture (see 20:15), I think cancer is a biological “vector bundle” with degenerate or absent transition functions. In other words, local data dominates and the cohesive global structure does not exist. I wonder how far the analogy goes.
0
0
5
@amelie_iska
amelie_schreiber
2 months
@310ai__ It might also be good to look into computing the LIS score based on the PAE output of RoseTTAFold All-Atom similar to what was done with AlphaFold-Multimer here. This is a new approach for protein-small molecule complexes.
0
1
3
@amelie_iska
amelie_schreiber
3 months
Hot take for some, obvious to others: GPUs and LLM oriented ASICs along with AI operating systems will make CPUs mostly obsolete. Anyone out there capable of writing CUDA kernels who can explain why this might be an erroneous prediction?
2
0
4
@amelie_iska
amelie_schreiber
4 months
@GabGarrett CLIP but for proteins and small molecules...
1
0
4
@amelie_iska
amelie_schreiber
17 days
Pretty neat. How does it compare to a contrastive model like ProteinDT or ProteinCLIP? And could we use it for annotating in order to train a new ProteinDT or ProteinCLIP? Is there an exceptional text-guided diffusion model coming soon for proteins?
@ptshaw2
Pete Shaw
18 days
Excited to share new work from @GoogleDeepMind : “ProtEx: A Retrieval-Augmented Approach for Protein Function Prediction”
Tweet media one
3
41
153
1
0
3
@amelie_iska
amelie_schreiber
7 months
@andrewwhite01 You can also learn equivariance. I think equivariance is an overrated mathematical concept tbh. It's fancy and neat from a mathematical perspective, but otherwise I think you could have your network learn it and get just as far if not further.
0
0
4
@amelie_iska
amelie_schreiber
6 days
This is such a cute animation! 😎 Now do it for 3 modalities like ProTrek (text, protein sequence, protein structure)! 🧬
@ProfTomYeh
Tom Yeh | AI by Hand ✍️
8 days
[CLIP] by Hand ✍️ The CLIP (Contrastive Language–Image Pre-training) model, a groundbreaking work by OpenAI, redefines the intersection of computer vision and natural language processing. It is the basis of all the multi-modal foundation models we see today. How does CLIP work?
8
195
1K
0
1
5
@amelie_iska
amelie_schreiber
28 days
Anyone who understands the current state of the art in pharmacogenetic testing for determining drug efficacy and side effects, please reach out for discussion or to share some papers. I have questions, and *potentially* a few good ideas on how to improve this.
0
0
3
@amelie_iska
amelie_schreiber
1 month
@befcorreia @karla_mcastro @_JosephWatson @jueseph @UWproteindesign Curious to know why RFDiffusion motif scaffolding wasn’t tried here instead of or in addition to the RoseTTAFold constrained hallucination.
1
1
4
@amelie_iska
amelie_schreiber
7 months
Anyone have any idea why in silico directed evolution might increase perplexity and intrinsic dimension of a protein? Are more fit proteins generally more complicated?
3
0
4
@amelie_iska
amelie_schreiber
29 days
I think actually training this model could be done on Lambda Labs for around $150K (20 days on 256 A100s), no? There is also a difference between training and inference that should be made clear here: inference (using the model for predictions) is much cheaper than training.
@RolandDunbrack
Roland Dunbrack 🏳️‍🌈 @rolanddunbrack.bsky.social
29 days
Because downloadable code would let 100s of scientists put AlphaFold3 through its paces on different kinds of systems -- for benchmarking and for developing new protocols and code that has the ability to access input parameters (e.g. like colabfold does) missing from the server.
0
1
15
1
2
4
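The ~$150K estimate above checks out as back-of-envelope arithmetic, reading "20 GPU days on 256 A100s" as 20 wall-clock days on 256 GPUs and assuming a rental rate of roughly $1.25 per A100-hour (the rate is my assumption; cloud pricing varies):

```python
# back-of-envelope check of the ~$150K training cost estimate
gpus = 256
days = 20
price_per_gpu_hour = 1.25          # assumed A100 hourly rate in USD

gpu_hours = gpus * days * 24       # total GPU-hours of compute
cost = gpu_hours * price_per_gpu_hour
print(gpu_hours, cost)             # 122,880 GPU-hours, ~$154K
```

So the figure is plausible at spot-market rates, and inference cost per prediction is orders of magnitude below this one-time training cost.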
@amelie_iska
amelie_schreiber
14 days
Until everyone can code in natural language, we will not have enough “coders”.
0
0
4
@amelie_iska
amelie_schreiber
3 months
@HannesStaerk Still REALLY want to see this done with AlphaFold-Multimer. Maybe there’s a dynamic model of PAE and LIS that comes out of this that helps determine how strong or transient a PPI is.
1
2
4
@amelie_iska
amelie_schreiber
3 months
@biorxiv_bioinfo Cool idea, but how was the dataset split into train, test, and validation? Was sequence similarity/homology used to split the protein dataset? If not, this paper's results are unreliable. You have to split your data based on sequence similarity; 30% similarity is a pretty standard cutoff.
0
0
3
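The homology-aware splitting demanded above can be sketched as: cluster sequences by identity, then assign whole clusters (never individual sequences) to train or test. This toy uses `difflib` as a crude stand-in for a real alignment/clustering tool like MMseqs2 or CD-HIT, and the greedy clustering is a simplification of true single-linkage:

```python
import difflib
import random

def identity(a, b):
    # crude sequence-identity proxy; real pipelines use MMseqs2/CD-HIT
    return difflib.SequenceMatcher(None, a, b).ratio()

def homology_split(seqs, threshold=0.3, test_frac=0.2, seed=0):
    """Greedily cluster sequences at `threshold` identity, then assign
    whole clusters to train or test so no test sequence has a close
    homolog in train."""
    clusters = []
    for s in seqs:
        for c in clusters:
            if any(identity(s, t) >= threshold for t in c):
                c.append(s)   # greedy: joins the first matching cluster
                break
        else:
            clusters.append([s])
    rng = random.Random(seed)
    rng.shuffle(clusters)
    n_test = max(1, int(test_frac * len(clusters)))
    test = [s for c in clusters[:n_test] for s in c]
    train = [s for c in clusters[n_test:] for s in c]
    return train, test
```

Splitting at the cluster level is the whole point: a random per-sequence split leaks near-duplicates across the boundary and inflates test metrics.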
@amelie_iska
amelie_schreiber
3 months
AlphaFlow-Multimer with the appropriate generalization of the LIS score would more or less solve PPI prediction. LIS alone already mostly solves it. Then the only bottleneck for giant detailed PPI networks is compute. This is a big deal. Explain to me why I might be wrong.
0
0
4
@amelie_iska
amelie_schreiber
27 days
I love this. Thank you! Gotta go watch now! p(shittakes | Michael Levin) << p(shittakes | Amelie Schreiber) 😂 Also, @drmichaellevin …feel free to DM anytime with project ideas 🤓
@labenz
Nathan Labenz
27 days
@amelie_iska @drmichaellevin Another of my favorite episodes!
0
0
3
1
0
4
@amelie_iska
amelie_schreiber
1 month
@iScienceLuvr SVD initialization would’ve helped a lot.
@danielhanchen
Daniel Han
1 month
My take on "LoRA Learns Less and Forgets Less" 1) "MLP/All" did not include gate_proj. QKVO, up & down trained but not gate (pg 3 footnote) 2) Why does LoRA perform well on math and not code? lm_head & embed_tokens wasn't trained, so domain shifts not modelled. Also reason why
Tweet media one
7
123
574
0
0
3
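The SVD initialization mentioned above (a PiSSA-style idea) can be sketched as initializing the LoRA factors from the top-r singular directions of the pretrained weight, keeping the residual frozen so the model is unchanged at initialization. Names here are my own illustration:

```python
import numpy as np

def svd_init_lora(W, r):
    """Initialize LoRA factors B, A from the top-r SVD of the pretrained
    weight instead of the usual zero/Gaussian init. The residual stays in
    the frozen base weight, so W_res + B @ A == W exactly at init."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    B = U[:, :r] * np.sqrt(S[:r])              # d x r
    A = np.sqrt(S[:r])[:, None] * Vt[:r, :]    # r x k
    W_res = W - B @ A                          # frozen residual weight
    return B, A, W_res

rng = np.random.default_rng(4)
W = rng.normal(size=(12, 10))
B, A, W_res = svd_init_lora(W, r=3)
# finetuning then updates only B and A, which start aligned with the
# most important directions of W rather than at zero
```

The intuition for "would've helped a lot": gradient steps immediately move the dominant singular directions instead of having to discover them from a zero start.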
@amelie_iska
amelie_schreiber
3 months
It would be very interesting and useful to see how this could be used in tandem with the following method for detecting binding sites of conformational ensembles of proteins using ESM-IF1:
0
0
3
@amelie_iska
amelie_schreiber
2 months
@ZymoSuperMan RFDiffusion works with structures, not sequences. For designing sequences that fold into the backbones RFDiffusion generates, you’ll need something like LigandMPNN, which does allow for things like biasing particular residues in various ways to constrain the designed sequences.
1
0
3
@amelie_iska
amelie_schreiber
16 days
After that, team up with @BakerLaboratory and make the best “RFDiffusion” and “LigandMPNN” anyone’s ever seen, but this time use continuous and discrete flow matching, respectively, and make it for all the biomolecules. 4 essential “foundation models” and we’ll be all set +/-ε 🎉😎🧬
0
0
3
@amelie_iska
amelie_schreiber
1 month
@lpachter For academic uses that don’t compete with Isomorphic Labs’ research… that part is subtle, but important. It means if you want to develop a new drug and have any hope of taking it to market, or if you’re not in academia, you’re out of luck. And no reproducing because patents! 😒
0
0
3
@amelie_iska
amelie_schreiber
2 months
Really cool channel. Maybe we’ll get a video on SE(3)-equivariant neural networks one day🤞This would be great for folks trying to understand new SOTA models for proteins and small molecules. I would totally be down to collaborate @mathemaniacyt 🧬
@mathemaniacyt
Mathemaniac
2 months
Why do we require Jacobi identity to be satisfied for a Lie bracket? In the process, we also understand intuitively why tr(AB) = tr(BA) without matrix components. Watch now:
Tweet media one
2
103
611
0
0
3