alishbaimran_ Profile Banner
Alishba Imran Profile
Alishba Imran

@alishbaimran_

Followers: 7K
Following: 8K
Media: 164
Statuses: 2K

CS @berkeley_eecs | curr: ML bio @arcinstitute, research @berkeley_ai | prev: @czbiohub, @tesla, @NVIDIA, founded ML battery startup

toronto
Joined November 2014
@alishbaimran_
Alishba Imran
3 months
The AI for Robotics e-book is out now! 🎉 450 pages, 200+ visuals, 150K words covering perception, 3D sensor fusion, foundation models, transformers & diffusion for control, sim, RL & more. Available now:
Nature Springer:
Amazon:
Tweet media one
Tweet media two
8
15
201
@alishbaimran_
Alishba Imran
2 days
RT @fleetwood___: Dropped the Virtual Cell Challenge Primer on HF. We are shipping transformers support for STATE (the SOTA model for pre….
0
8
0
@alishbaimran_
Alishba Imran
4 days
Elegant approach for designing TF payloads for epigenetic reprogramming, which enables in silico prediction of cell state changes from sparse combinatorial data:
@jacobkimmel
Jacob Kimmel
5 days
Reprogramming cells with transcription factors is our most expressive tool for engineering cell state. Traditionally, we found TFs by ~guesswork. At @icmlconf we're sharing @newlimit's SOTA AI models that can design reprogramming payloads by building on molecular foundation models
Tweet media one
0
1
5
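A minimal sketch of the idea in the tweet above, not NewLimit's actual model or data: if each TF has an embedding from a molecular foundation model, a combinatorial payload can be featurized by composing those embeddings, and a simple regressor fit on a handful of measured payloads can then score unseen combinations in silico. The TF names, embedding dimensions, and measured scores below are all hypothetical stand-ins.

```python
# Minimal sketch (not NewLimit's actual model): score unseen TF payloads by
# composing per-TF embeddings and regressing measured cell-state shifts.
import numpy as np
from itertools import combinations
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical per-TF embeddings, standing in for a molecular foundation model.
tfs = ["OCT4", "SOX2", "KLF4", "MYC", "NANOG", "GATA4"]
emb = {tf: rng.normal(size=64) for tf in tfs}

def payload_features(payload):
    """Represent a combinatorial payload as the mean of its TF embeddings."""
    return np.mean([emb[tf] for tf in payload], axis=0)

# Sparse training data: a few measured payloads -> scalar cell-state score
# (e.g. similarity of the perturbed transcriptome to a target state).
measured = {("OCT4", "SOX2"): 0.7, ("KLF4", "MYC"): 0.2, ("SOX2", "KLF4", "MYC"): 0.5}
X = np.stack([payload_features(p) for p in measured])
y = np.array(list(measured.values()))

model = Ridge(alpha=1.0).fit(X, y)

# In silico screen: rank every 2-TF payload that has not been measured yet.
candidates = [c for c in combinations(tfs, 2) if c not in measured]
scores = model.predict(np.stack([payload_features(c) for c in candidates]))
for c, s in sorted(zip(candidates, scores), key=lambda t: -t[1])[:3]:
    print(c, round(float(s), 3))
```

Composing embeddings lets information learned from measured payloads transfer to combinations that were never assayed, which is what makes prediction from sparse combinatorial data possible in the first place.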
@alishbaimran_
Alishba Imran
6 days
This review from the director of AI research at @microsoft perfectly captures why we wrote our book AI for Robotics! “… reframing classic robotics challenges through a deep learning lens.” We always find reviews/feedback like this really helpful! 🙏
Tweet media one
0
0
8
@alishbaimran_
Alishba Imran
6 days
RT @KevinKaichuang: "The body of data available in protein sequences is something fundamentally new in biology and biochemistry, unpreceden….
0
6
0
@alishbaimran_
Alishba Imran
8 days
RT @niteshgarg03: Reading this book, gotta say liking it so far. Traditional robotics books (as important as they are for foundational know….
0
2
0
@alishbaimran_
Alishba Imran
16 days
RT @WholeMarsBlog: Some of the most exciting applications of deep learning are in biology. Really fascinating work.
0
6
0
@alishbaimran_
Alishba Imran
16 days
Work done with @edyoshikun, Soorya Pradeep, Ziwen Liu, @mattersOfLight & the rest of the team @czbiohub!
0
0
5
@alishbaimran_
Alishba Imran
16 days
Try it out:
- Code & training pipeline –
- Annotation + visualization plugin –
- Paper:
(4/4)
1
0
4
@alishbaimran_
Alishba Imran
16 days
DynaCLR’s temporal embeddings enable in silico synchronization of asynchronous dynamics like infection, mitosis, and organelle remodeling. Using dynamic time warping, we estimate pseudotime across multiple imaging channels. (3/4)
1
0
3
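A minimal sketch of the dynamic-time-warping step described above, not the DynaCLR codebase: given per-frame embedding trajectories for two asynchronously imaged cells, DTW finds the optimal frame-to-frame alignment, which can be read as a shared pseudotime axis. The trajectories and noise below are synthetic stand-ins for real embeddings.

```python
# Minimal dynamic time warping sketch (not the DynaCLR implementation): align two
# asynchronous embedding trajectories and read off a pseudotime mapping.
import numpy as np

def dtw_path(A, B):
    """A: (T1, d), B: (T2, d) embedding trajectories. Returns aligned index pairs."""
    T1, T2 = len(A), len(B)
    dist = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    D = np.full((T1 + 1, T2 + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, T1 + 1):
        for j in range(1, T2 + 1):
            D[i, j] = dist[i - 1, j - 1] + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack the optimal warping path.
    path, i, j = [], T1, T2
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

# Two cells imaged asynchronously: same underlying dynamics, shifted in time.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
cell_a = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
cell_b = np.stack([np.sin(2 * np.pi * (t - 0.15)), np.cos(2 * np.pi * (t - 0.15))], axis=1)
pairs = dtw_path(cell_a + 0.01 * rng.normal(size=cell_a.shape),
                 cell_b + 0.01 * rng.normal(size=cell_b.shape))
print(pairs[:5])  # cell_a frame -> cell_b frame, i.e. a shared pseudotime axis
```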
@alishbaimran_
Alishba Imran
16 days
We use knowledge distillation to scale label-free infection phenotyping: a model trained on 10k fluorescence-labeled cells generates 133k pseudo-labels for phase data, cutting manual labeling 10×. (2/4)
Tweet media one
1
0
3
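A minimal sketch of the distillation setup described above, not the paper's pipeline: a teacher trained on the small fluorescence-annotated set pseudo-labels a much larger unlabeled phase-contrast set, and a student is trained on the union. The features here are random stand-ins for real per-cell embeddings, and the confidence cutoff is an assumption.

```python
# Minimal distillation sketch (not the paper's pipeline): a teacher trained on a
# small fluorescence-annotated set pseudo-labels a much larger phase-only set,
# and a student is trained on the combined labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def fake_features(n, infected_frac=0.5):
    """Stand-in for per-cell image embeddings; real inputs would be learned features."""
    y = (rng.random(n) < infected_frac).astype(int)
    X = rng.normal(size=(n, 32)) + y[:, None] * 0.8  # infected cells shifted in feature space
    return X, y

# ~10k cells with fluorescence-derived infection labels (ground truth).
X_labeled, y_labeled = fake_features(10_000)
# ~133k phase-contrast cells with no labels.
X_unlabeled, _ = fake_features(133_000)

teacher = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)

# Keep only confident pseudo-labels to limit label noise (0.9 is an assumed cutoff).
probs = teacher.predict_proba(X_unlabeled)
confident = probs.max(axis=1) > 0.9
pseudo_y = probs.argmax(axis=1)[confident]

X_student = np.vstack([X_labeled, X_unlabeled[confident]])
y_student = np.concatenate([y_labeled, pseudo_y])
student = LogisticRegression(max_iter=1000).fit(X_student, y_student)
print("student training set size:", len(y_student))
```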
@alishbaimran_
Alishba Imran
16 days
Excited to share our new paper on DynaCLR, a self-supervised model for learning cell dynamics from terabytes of live imaging! Enables:
- Detection of infection & cell division
- Label-free prediction of states
- Alignment of async cell trajectories
- Organelle remodeling
(1/4)
Tweet media one
1
5
136
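A toy version of the self-supervised objective implied above, not the actual DynaCLR architecture: crops of the same tracked cell at consecutive timepoints are treated as a positive pair and other cells in the batch as negatives, trained with an InfoNCE loss. The encoder, crop size, and temperature below are illustrative choices.

```python
# Minimal time-aware contrastive sketch (not the actual DynaCLR model): crops of
# the same tracked cell at adjacent timepoints are positives; other cells in the
# batch are negatives (InfoNCE).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CellEncoder(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)

def info_nce(z_t, z_t1, temperature=0.1):
    """z_t[i] and z_t1[i] are the same cell at consecutive frames."""
    logits = z_t @ z_t1.T / temperature   # (B, B) cosine similarities
    targets = torch.arange(len(z_t))      # matching index = positive pair
    return F.cross_entropy(logits, targets)

encoder = CellEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# Stand-in batch: 64 tracked cells, single-channel 32x32 crops at frames t and t+1.
crops_t = torch.randn(64, 1, 32, 32)
crops_t1 = crops_t + 0.05 * torch.randn_like(crops_t)  # small temporal change

loss = info_nce(encoder(crops_t), encoder(crops_t1))
loss.backward()
opt.step()
print(float(loss))
```

Pulling temporally adjacent views of the same cell together is what makes the resulting embedding space sensitive to dynamics such as infection, mitosis, and organelle remodeling.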
@alishbaimran_
Alishba Imran
16 days
RT @edyoshikun: Excited to share an update on #DynaCLR—a self-supervised method to learn dynamic cell & organelle embeddings from time-laps….
arxiv.org
We report DynaCLR, a self-supervised method for embedding cell and organelle Dynamics via Contrastive Learning of Representations of time-lapse images. DynaCLR integrates single-cell tracking and...
0
10
0
@alishbaimran_
Alishba Imran
26 days
We’re also excited about this result! Zero-shot prediction showed clear value of embeddings:
- Pretraining State on Tahoe-100M
- Fully fine-tuning on smaller, noisier datasets
- Led to more accurate perturbation ranking prediction than mean baselines or HVG-trained State models
@ElliotHershberg
Elliot Hershberg
26 days
The result I'm most excited about from Arc's new State model:. The ability to generalize on zero-shot out-of-distribution predictions after pre-training on the TAHOE-100M data set. Whereas PLMs have seemingly benefitted less from scaling data and model size, this is an inkling
Tweet media one
1
5
34
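A minimal sketch of what beating a mean baseline on perturbation ranking means in practice, not Arc's evaluation code: a mean baseline predicts the same average response for every perturbation, so its predicted per-gene shifts carry no perturbation-specific signal, whereas a model with real signal correlates with the observed shifts. All data below are synthetic.

```python
# Minimal ranking-evaluation sketch (not Arc's evaluation code): score how well a
# model's predicted expression shifts match the observed shifts for each
# perturbation, versus a baseline that predicts the dataset-mean response.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_perts, n_genes = 200, 1000

control = rng.normal(size=n_genes)                # mean control expression
true_delta = rng.normal(size=(n_perts, n_genes))  # per-perturbation shift
observed = control + true_delta

# Mean baseline: predict the average perturbed profile for every perturbation.
baseline_pred = np.tile(observed.mean(axis=0), (n_perts, 1))
# Model with imperfect but real per-perturbation signal.
model_pred = control + true_delta + 0.8 * rng.normal(size=true_delta.shape)

def mean_rank_corr(pred):
    """Average Spearman correlation of predicted vs observed per-gene shifts."""
    corrs = [spearmanr(p - control, o - control)[0]
             for p, o in zip(pred, observed)]
    return float(np.mean(corrs))

print("mean baseline:", round(mean_rank_corr(baseline_pred), 3))
print("model:        ", round(mean_rank_corr(model_pred), 3))
```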
@alishbaimran_
Alishba Imran
27 days
Following the release of STATE, Arc is launching a new challenge! Build ML models to predict how human cells respond to perturbations, using our new H1 human embryonic stem cell line data. Check it out:
@davey_burke
Dave Burke
27 days
Announcing the inaugural Virtual Cell Challenge! Hosted by Arc Institute, and sponsored by Nvidia, 10x, and Ultima, help solve one of biology’s biggest challenges with AI by building cell state models that accurately predict responses to perturbation.
Tweet media one
0
0
13
@alishbaimran_
Alishba Imran
30 days
Learned a lot working on this with @abhinadduri, @yusufroohani, @davey_burke, @genophoria and the rest of the team!
0
1
7
@alishbaimran_
Alishba Imran
30 days
Excited to share what I’ve been working on at @arcinstitute! STATE is a transformer-based model trained to predict how cells respond to perturbations, using the largest single-cell perturbation dataset to date: 170M observational and 100M+ perturbational cells across 70 cell lines.
@pdhsu
Patrick Hsu
30 days
Today @arcinstitute releases State, our first perturbation prediction AI model and an important step towards our goal of a virtual cell. State is designed to learn how to shift cells between states (e.g. “diseased” to “healthy”) using drugs, cytokines, or genetic perturbations
Tweet media one
7
13
287
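A toy sketch of the high-level setup described above, not the released STATE architecture or training recipe: a transformer reads a set of control cell expression profiles together with a perturbation token and outputs predicted perturbed profiles. Gene counts, model sizes, and the training target below are placeholders.

```python
# Toy sketch of the idea (not the released STATE architecture): a transformer
# reads a set of control cell profiles plus a perturbation token and predicts
# the corresponding perturbed expression profiles.
import torch
import torch.nn as nn

class ToyPerturbationModel(nn.Module):
    def __init__(self, n_genes=512, n_perts=100, d_model=128):
        super().__init__()
        self.cell_in = nn.Linear(n_genes, d_model)      # embed each cell's expression
        self.pert_emb = nn.Embedding(n_perts, d_model)  # embed the perturbation
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_genes)         # decode perturbed expression

    def forward(self, control_cells, pert_id):
        # control_cells: (batch, n_cells, n_genes); pert_id: (batch,)
        tokens = self.cell_in(control_cells)
        pert_tok = self.pert_emb(pert_id)[:, None, :]   # prepend a perturbation token
        h = self.encoder(torch.cat([pert_tok, tokens], dim=1))
        return self.head(h[:, 1:, :])                   # predicted perturbed cells

model = ToyPerturbationModel()
control = torch.randn(8, 32, 512)       # 8 batches of 32 control cells
pert = torch.randint(0, 100, (8,))
pred = model(control, pert)             # (8, 32, 512) predicted perturbed profiles
loss = nn.functional.mse_loss(pred, control + 0.1)  # stand-in target for illustration
loss.backward()
print(pred.shape)
```

Operating on sets of cells rather than single cells is one way a model like this can capture population-level shifts between states, though the real model's conditioning and objectives are described in the STATE release rather than here.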
@alishbaimran_
Alishba Imran
30 days
RT @arcinstitute: Introducing Arc Institute’s first virtual cell model: STATE
Tweet media one
0
111
0
@alishbaimran_
Alishba Imran
1 month
RT @percyliang: Wrapped up Stanford CS336 (Language Models from Scratch), taught with an amazing team @tatsu_hashimoto @marcelroed @neilbba….
0
567
0