
vishal
@vishal_learner
Followers
764
Following
7K
Media
2K
Statuses
4K
https://t.co/tSqI2ObecS Maintainer. https://t.co/gFYvT1B8or community member. Will post about sports occasionally. #FlyEaglesFly
Joined August 2023
This is incredibly exciting and a tremendous opportunity for me. Studying ColBERT (both the papers and the repo) has been one of my greatest sources of joy this past year. Honored to have the opportunity to contribute to the community!
Welcome @vishal_learner as a new Maintainer of the ColBERT repo! I’ve long loved Vishal’s ColBERT and PLAID deep dives here and on YouTube. Thanks to the many incredible folks who DM’ed.
7
4
71
RT @lateinteraction: “we don’t do evals, we just rely on reinforcement learning with verifiable rewards”.
0
11
0
RT @charles_irl: I like making GPUs go brrt at @modal. I wrote up what I've learned along the way in an extension to the GPU Glossary -- o….
0
95
0
RT @BlancheMinerva: @giffmana My intuition for LLMs has always been that there's a minimum quality bar you want to pass but that once you'v….
0
3
0
TIL there are recommended tolerances for torch.allclose based on precision (this one is from bitsandbytes). I've been using defaults (atol = 1e-08, rtol = 1e-05) so will redo all my colbert-ai index artifact comparisons across PyTorch versions!! Excited to see how results change.
1
2
5
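A minimal sketch of the precision-aware comparison described in the tweet above. The tolerance table is an assumption, loosely modeled on torch.testing.assert_close's dtype-dependent defaults rather than the bitsandbytes values; the artifact file names are hypothetical.

```python
import torch

# Illustrative dtype-dependent tolerances (an assumption, loosely mirroring
# torch.testing.assert_close's defaults -- not the bitsandbytes table).
TOLERANCES = {
    torch.float32: dict(rtol=1.3e-6, atol=1e-5),
    torch.float16: dict(rtol=1e-3, atol=1e-5),
    torch.bfloat16: dict(rtol=1.6e-2, atol=1e-5),
}

def artifacts_close(a: torch.Tensor, b: torch.Tensor) -> bool:
    """Compare two index artifacts using tolerances keyed on their precision."""
    assert a.dtype == b.dtype, "compare artifacts in the same dtype"
    tol = TOLERANCES.get(a.dtype, dict(rtol=1e-5, atol=1e-8))  # fall back to torch.allclose defaults
    return torch.allclose(a.float(), b.float(), **tol)

# Hypothetical usage with centroids saved under two PyTorch versions:
# old = torch.load("centroids_torch271.pt")
# new = torch.load("centroids_torch280.pt")
# print(artifacts_close(old, new))
```

The point of the looser fp16/bf16 tolerances is that a "mismatch" flagged by torch.allclose's fp32-oriented defaults may disappear once the comparison is done at precision-appropriate thresholds.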
RT @vishal_learner: Just published a video where I do a step by step walkthrough of building this (simple) pipeline in DocWrangler locally.….
0
2
0
RT @vishal_learner: Just published a video where I do a step by step walkthrough of building a (simple) pipeline in DocWrangler locally. I….
0
1
0
RT @vishal_learner: New blog post explores why torch==2.7.1 → 2.8.0 breaks colbert-ai index reproducibility. Root cause: PyTorch versions h….
0
1
0
RT @vishal_learner: I have met my 2025 goal of publishing 50 ML blog posts 🎉📈.
0
4
0
RT @vishal_learner: Thanks for asking this as it gave me an opportunity to reflect. I couldn't pick just 1 so I picked my top 5. Naturally….
0
1
0
my favorite time of year is back. thank god.
0
0
2
Thanks for asking this as it gave me an opportunity to reflect. I couldn't pick just 1 so I picked my top 5. Naturally I published it as a blog post.
@vishal_learner Do you have one that is your favorite or one you are most proud of?
1
1
2
RT @vishal_learner: @skylar_b_payne Thanks for asking this as it gave me an opportunity to reflect. I couldn't pick just 1 so I picked my t….
0
1
0
Is it something about the centroids tensor values? Probably not. I created random tensors in each torch version and compared them. It's only the normalized half precision tensor that diverges. Unlikely that I'll figure out why this happens, but documenting that it does. (4/4)
0
0
0
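A minimal sketch of the experiment described in the (4/4) tweet above, meant to be run once under each PyTorch version and then diffed; the seed, tensor shape, and file names are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1000, 128)                       # raw random tensor (fp32)

normalized_half = F.normalize(x, dim=-1).half()  # L2-normalize, then cast to fp16

torch.save(
    {"raw": x, "normalized_half": normalized_half},
    f"repro_torch_{torch.__version__}.pt",
)

# Comparison (in a separate run, once both files exist):
# a = torch.load("repro_torch_2.7.1.pt")
# b = torch.load("repro_torch_2.8.0.pt")
# print(torch.equal(a["raw"], b["raw"]))                        # per the thread: these match
# print((a["normalized_half"] != b["normalized_half"]).sum())   # per the thread: nonzero here
```

As the thread reports, the raw random tensors match across versions; it is only after normalization and the cast to half precision that the values diverge.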