Victoria X Lin

@VictoriaLinML

Followers
4K
Following
10K
Media
58
Statuses
1K

MTS @thinkymachines | MoMa/MoT🖼 • RA-DIT🔍 • Llama4🦙 🧵 https://t.co/j6QTac4SaT 🌴 Bay Area Ex: @AIatMeta @SFResearch • PhD @uwcse

San Francisco, CA
Joined December 2010
@VictoriaLinML
Victoria X Lin
1 year
1/n Introducing MoMa 🖼, our new sparse early-fusion architecture for mixed-modal language modeling that significantly boosts pre-training efficiency 🚀 ( https://t.co/AmemA1SOM1). MoMa employs a mixture-of-expert (MoE) framework with modality-specific expert groups. Given any
8
51
306
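The modality-specific expert groups described above can be sketched in a few lines. This is a toy illustration only (random weights, top-1 routing, made-up sizes), not the MoMa implementation: each token is routed to an expert drawn from the group matching its modality.

```python
import numpy as np

rng = np.random.default_rng(0)
D, E = 8, 2  # hidden size, experts per modality group

# A tiny linear "expert" per slot; separate groups for text and image tokens.
experts = {
    "text":  [rng.standard_normal((D, D)) * 0.1 for _ in range(E)],
    "image": [rng.standard_normal((D, D)) * 0.1 for _ in range(E)],
}
routers = {m: rng.standard_normal((D, E)) * 0.1 for m in experts}

def moe_layer(x, modality):
    """Route each token to the top-1 expert within its modality's group."""
    logits = x @ routers[modality]                 # (T, E) routing scores
    top = logits.argmax(axis=-1)                   # chosen expert per token
    gate = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)
    out = np.zeros_like(x)
    for e, W in enumerate(experts[modality]):
        mask = top == e                            # tokens assigned to expert e
        out[mask] = gate[mask, e:e + 1] * (x[mask] @ W)
    return out

tokens = rng.standard_normal((5, D))
y_text = moe_layer(tokens, "text")   # same tokens, different expert group
y_img = moe_layer(tokens, "image")
print(y_text.shape, y_img.shape)
```

The point of the sketch is the routing boundary: the same token stream hits entirely disjoint parameter sets depending on its modality, which is what makes the sparsity modality-aware.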
@AkshatS07
Akshat Shrivastava
4 days
We’ve been integrating Isaac across the industry and have realized developers are missing a single platform for Physical AI – prompt engineering, deployment, and integration. Today we are excited to release Perceptron’s Platform - supporting our API - supporting chat
@perceptroninc
Perceptron AI
4 days
Perceptron’s platform is here — built for Physical AI. Developers can now use Isaac-0.1 or Qwen3VL 235B via: Perceptron API — fast, reliable multimodal intelligence. Python SDK — simple, grounded prompting for vision + language. Build apps that see and understand the world.
2
3
14
@ArmenAgha
Armen Aghajanyan
4 days
While our models are baking, we've been building a platform purpose-built for physical AI—with perception at its core. Currently supports Isaac-0.1 and Qwen3VL-235B, with future model releases landing here too. We've unified task+output structures across perception models:
github.com
The official Python SDK for the Perceptron API. Contribute to perceptron-ai-inc/perceptron development by creating an account on GitHub.
@perceptroninc
Perceptron AI
4 days
Perceptron’s platform is here — built for Physical AI. Developers can now use Isaac-0.1 or Qwen3VL 235B via: Perceptron API — fast, reliable multimodal intelligence. Python SDK — simple, grounded prompting for vision + language. Build apps that see and understand the world.
5
13
81
@soumithchintala
Soumith Chintala
10 days
Leaving Meta and PyTorch I'm stepping down from PyTorch and leaving Meta on November 17th. tl;dr: Didn't want to be doing PyTorch forever, seemed like the perfect time to transition right after I got back from a long leave and the project built itself around me. Eleven years
501
581
11K
@srush_nlp
Sasha Rush
12 days
I think about this talk a lot. There was a time when people were bullish on "feed all the modalities to the LLM," but it didn't really pan out as I would have expected. The discrete / continuous divide remains an interesting challenge in deep learning.
@COLM_conf
Conference on Language Modeling
12 days
COLM Keynotes: Luke Zettlemoyer Mixed-modal Language Modeling https://t.co/8FdhhrfOnG
12
20
225
@VictoriaLinML
Victoria X Lin
12 days
🤞🤞
@_junaidkhalid1
JK
13 days
@liliyu_lili @thinkymachines Congrats on the move. The "kind, world-class team" part is often underestimated in these announcements. Technical ambition is common enough in AI right now.. but building something genuinely novel requires a team culture that can sustain deep collaboration without burning out.
0
0
4
@VictoriaLinML
Victoria X Lin
14 days
Very interesting read ☕ When poking different frontier models (e.g., GPT-5 vs Gemini), I’ve often noticed surprising similarity on non-STEM questions. This paper carefully quantified the “inter-model homogeneity” as part of their study — both in terms of embedding similarity and
@liweijianglw
Liwei Jiang
18 days
⚠️Different models. Same thoughts.⚠️ Today’s AI models converge into an 𝐀𝐫𝐭𝐢𝐟𝐢𝐜𝐢𝐚𝐥 𝐇𝐢𝐯𝐞𝐦𝐢𝐧𝐝 🐝, a striking case of mode collapse that persists even across heterogeneous ensembles. Our #neurips2025 𝐃&𝐁 𝐎𝐫𝐚𝐥 𝐩𝐚𝐩𝐞𝐫 (✨𝐭𝐨𝐩 𝟎.𝟑𝟓%✨) dives deep into
0
0
12
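The embedding-similarity side of the "inter-model homogeneity" measurement mentioned above can be sketched with cosine similarity. The embeddings below are synthetic stand-ins for illustration, not real model outputs or the paper's actual protocol:

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two response embeddings."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
base = rng.standard_normal(16)                    # shared "consensus" answer
model_a = base + 0.05 * rng.standard_normal(16)   # two models giving
model_b = base + 0.05 * rng.standard_normal(16)   # near-identical answers
unrelated = rng.standard_normal(16)               # an independent answer

sim_models = cosine(model_a, model_b)   # high: homogeneous responses
sim_random = cosine(model_a, unrelated) # baseline for unrelated text
print(sim_models > sim_random)
```

In the real setting one would embed each model's response to the same prompt with a fixed text encoder and compare pairwise similarities against a between-prompt baseline.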
@thinkymachines
Thinking Machines
18 days
Today we’re announcing research and teaching grants for Tinker: credits for scholars and students to fine-tune and experiment with open-weight LLMs. Read more and apply at:
19
119
992
@universeinanegg
Ari Holtzman
20 days
I'm recruiting PhD students! I'm interested in: 1. Understanding how LLMs 'see' the world (ex: LMs can't see conspicuous omissions, see AbsenceBench) 2. How can we make things with LLMs that have never been made before? (ex: Communication Games, see 📌) 3. See my other posts :)
21
100
641
@tydsh
Yuandong Tian
25 days
Several of my team members + myself are impacted by this layoff today. Welcome to connect :)
474
280
7K
@syhw
Gabriel Synnaeve
1 month
This is an excellent history of LLMs; it doesn't miss any seminal paper I know of. It reminds you we're standing on the shoulders of giants, and giants are still being born today.
12
115
693
@sivareddyg
Siva Reddy
1 month
Luke Zettlemoyer (@LukeZettlemoyer) plenary talk on scalable architectures for multimodal language modeling #COLM2025 Chameleon: autoregressive multimodal language models -- treat images as tokens -- works but harder to scale -- the modality gap seems to be a big problem
2
13
117
@johnschulman2
John Schulman
2 months
Tinker provides an abstraction layer that is the right one for post-training R&D -- it's the infrastructure I've always wanted. I'm excited to see what people build with it. "Civilization advances by extending the number of important operations which we can perform without
@thinkymachines
Thinking Machines
2 months
Introducing Tinker: a flexible API for fine-tuning language models. Write training loops in Python on your laptop; we'll run them on distributed GPUs. Private beta starts today. We can't wait to see what researchers and developers build with cutting-edge open models!
49
115
1K
@sschoenholz
Sam Schoenholz
2 months
Tinker brings tools similar to the ones we use internally to the community. It provides a clean, transparent abstraction that lets researchers write expressive experiments and training pipelines, while we manage the complexities of distributed training and sampling. We hope
@thinkymachines
Thinking Machines
2 months
Introducing Tinker: a flexible API for fine-tuning language models. Write training loops in Python on your laptop; we'll run them on distributed GPUs. Private beta starts today. We can't wait to see what researchers and developers build with cutting-edge open models!
4
14
129
@thinkymachines
Thinking Machines
2 months
Introducing Tinker: a flexible API for fine-tuning language models. Write training loops in Python on your laptop; we'll run them on distributed GPUs. Private beta starts today. We can't wait to see what researchers and developers build with cutting-edge open models!
227
792
6K
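The "write training loops in Python on your laptop; we'll run them on distributed GPUs" pattern can be illustrated with a toy backend object. Everything below is hypothetical and purely illustrative — the class and method names are NOT Tinker's actual API — but it shows the shape of the abstraction: the user owns the loop, the backend owns the compute.

```python
class LocalBackend:
    """Stand-in for a service that would run these steps on remote GPUs.

    Here it fits a 1-parameter linear model with squared-error loss.
    """
    def __init__(self, lr=0.1):
        self.weight, self.lr = 0.0, lr

    def forward_backward(self, x, y):
        pred = self.weight * x
        grad = 2 * (pred - y) * x          # d(loss)/d(weight)
        return (pred - y) ** 2, grad

    def optim_step(self, grad):
        self.weight -= self.lr * grad      # plain SGD update

backend = LocalBackend()
data = [(1.0, 2.0), (2.0, 4.0)]            # samples from y = 2x
for _ in range(50):                        # the user-owned training loop
    for x, y in data:
        loss, grad = backend.forward_backward(x, y)
        backend.optim_step(grad)
print(round(backend.weight, 2))
```

The design point is that the loop body only ever talks to two backend primitives (forward/backward and optimizer step), so the same user code runs whether the backend is local or a fleet of GPUs.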
@thinkymachines
Thinking Machines
2 months
LoRA makes fine-tuning more accessible, but it's unclear how it compares to full fine-tuning. We find that the performance often matches closely---more often than you might expect. In our latest Connectionism post, we share our experimental results and recommendations for LoRA.
82
564
3K
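For readers unfamiliar with LoRA, the low-rank update being compared against full fine-tuning looks roughly like this (sizes and initialization below are illustrative, not the post's experimental setup): the pretrained weight W stays frozen and only two small factors A and B are trained.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 64, 4  # hidden size, LoRA rank (r << d)

W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((d, r)) * 0.01   # trainable low-rank factor
B = np.zeros((r, d))                     # B starts at zero, so the
                                         # update W + A@B equals W at init

def forward(x):
    return x @ W + x @ A @ B             # base path plus low-rank delta

x = rng.standard_normal((2, d))
print(np.allclose(forward(x), x @ W))    # LoRA is a no-op before training

# Trainable parameter count drops from d*d to 2*d*r.
print(d * d, 2 * d * r)
```

With d=64 and r=4 the trainable parameters shrink from 4096 to 512, which is why LoRA is so much cheaper than full fine-tuning; the post's question is how much, if anything, that costs in quality.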
@jaseweston
Jason Weston
3 months
...is today a good day for new paper posts? 🤖Learning to Reason for Factuality 🤖 📝: https://t.co/ss09xKGcAm - New reward func for GRPO training of long CoTs for *factuality* - Design stops reward hacking by favoring precision, detail AND quality - Improves base model across
1
50
386
@RulinShao
Rulin Shao
4 months
Happy to share that ReasonIR is accepted by @COLM_conf! Synthetic data & test-time scaling are powerful tools to enable new capabilities for challenging tasks. I’m impressed by how quickly smaller retrievers and better rerankers have been developed with ReasonIR data! #COLM2025
@RulinShao
Rulin Shao
7 months
Meet ReasonIR-8B✨the first retriever specifically trained for reasoning tasks! Our challenging synthetic training data unlocks SOTA scores on reasoning IR and RAG benchmarks. ReasonIR-8B ranks 1st on BRIGHT and outperforms search engine and retriever baselines on MMLU and GPQA🔥
2
15
133
@AkariAsai
Akari Asai
4 months
Some updates 🚨 I finished my Ph.D at @uwcse in June 2025! After a year at AI2 as a Research Scientist, I am joining CMU @LTIatCMU & @mldcmu (courtesy) as an Assistant Professor in Fall 2026. The journey, acknowledgments & recruiting in 🧵
122
63
1K
@VictoriaLinML
Victoria X Lin
2 months
Gorgeous building! Just learned that both the CDIS building at UW–Madison and the Bill & Melinda Gates Center at U Washington are by the same architects — @LMNArchitects. 🏨 UW-Madison: https://t.co/lPou0veRry 🏨 U Washington:
lmnarchitects.com
@SharonYixuanLi
Sharon Li
3 months
My students called the new CDIS building “state-of-the-art”. I thought they were exaggerating. Today I moved in and saw it for myself. Wow. Photos cannot capture the beauty of the design.
0
0
6