Zico Kolter Profile
Zico Kolter

@zicokolter

Followers: 23K · Following: 815 · Media: 38 · Statuses: 637

Professor and Head of Machine Learning Department at @CarnegieMellon. Board member @OpenAI. Chief Technical Advisor @GraySwanAI. Chief Expert @BoschGlobal.

Pittsburgh, PA
Joined March 2017
Zico Kolter (@zicokolter) · 10 days ago
RT @Eric_Wallace_: Today we release gpt-oss-120b and gpt-oss-20b—two open-weight LLMs that deliver strong performance and agentic tool use.….
Replies: 0 · Reposts: 384 · Likes: 0
Zico Kolter (@zicokolter) · 10 days ago
RT @JoHeidecke: Open models can unlock huge benefits, and like any powerful technology, they carry misuse risks. Once the weights are relea….
Replies: 0 · Reposts: 4 · Likes: 0
Zico Kolter (@zicokolter) · 15 days ago
RT @aran_nayebi: 1/ Updated now with nearly tight lower bounds—i.e., proofs showing when alignment becomes intractable, even for ideal agen….
Replies: 0 · Reposts: 2 · Likes: 0
Zico Kolter (@zicokolter) · 17 days ago
RT @andyzou_jiaming: We deployed 44 AI agents and offered the internet $170K to attack them. 1.8M attempts, 62K breaches, including data l….
Replies: 0 · Reposts: 398 · Likes: 0
Zico Kolter (@zicokolter) · 2 months ago
RT @yidingjiang: A mental model I find useful: all data acquisition (web scrapes, synthetic data, RL rollouts, etc.) is really an explorati….
Linked post (yidingjiang.github.io): This post explores the idea that the next breakthroughs in AI may hinge more on how we collect experience through exploration, and less on how many parameters and data points we have.
Replies: 0 · Reposts: 58 · Likes: 0
Zico Kolter (@zicokolter) · 2 months ago
RT @maksym_andr: 🚨Excited to release OS-Harm! 🚨. The safety of computer use agents has been largely overlooked. We created a new safety b….
Replies: 0 · Reposts: 28 · Likes: 0
Zico Kolter (@zicokolter) · 2 months ago
RT @_vaishnavh: Wrote my first blog post! I wanted to share a powerful yet under-recognized way to develop emotional maturity as a research….
Replies: 0 · Reposts: 14 · Likes: 0
Zico Kolter (@zicokolter) · 2 months ago
RT @YixuanEvenXu: ✨ Did you know that NOT using all generated rollouts in GRPO can boost your reasoning LLM? Meet PODS! We down-sample roll….
Replies: 0 · Reposts: 15 · Likes: 0
Zico Kolter (@zicokolter) · 3 months ago
RT @haok1402: Introducing FLAME-MoE: a fully open platform for Mixture-of-Experts (MoE) research. All code, data, checkpoints, training log….
Replies: 0 · Reposts: 21 · Likes: 0
Zico Kolter (@zicokolter) · 3 months ago
RT @ZhengyangGeng: Excited to share our work with my amazing collaborators, @Goodeat258, @SimulatedAnneal, @zicokolter, and Kaiming. In a….
Replies: 0 · Reposts: 39 · Likes: 0
Zico Kolter (@zicokolter) · 3 months ago
RT @pratyushmaini: Excited to be talking today about how research into memorization provides a fundamentally different lens on safety!.
Replies: 0 · Reposts: 9 · Likes: 0
Zico Kolter (@zicokolter) · 3 months ago
RT @RuntianZhai: A shorter version of the first three chapters of my thesis is accepted by ICML 2025. It provides a quick start for those i….
Linked paper (arxiv.org): Despite the empirical success of foundation models, we do not have a systematic characterization of the representations that these models learn. In this paper, we establish the contexture theory....
Replies: 0 · Reposts: 2 · Likes: 0
Zico Kolter (@zicokolter) · 4 months ago
RT @pratyushmaini: Looking forward to giving a talk this Friday @OpenAI with @zhilifeng on some of our privacy & memorization research + ho….
Replies: 0 · Reposts: 12 · Likes: 0
Zico Kolter (@zicokolter) · 4 months ago
RT @electronickale: ✨ Love 4o-style image generation but prefer to use Midjourney? Tired of manual prompt crafting from inspo images?. PRIS….
Replies: 0 · Reposts: 31 · Likes: 0
Zico Kolter (@zicokolter) · 4 months ago
RT @_christinabaek: When we train models to do QA, are we robustly improving context dependency? No!. In our ICLR Oral (Fri 11 AM), we show….
Replies: 0 · Reposts: 20 · Likes: 0
Zico Kolter (@zicokolter) · 4 months ago
RT @kchonyc: spicy @zicokolter
[image]
Replies: 0 · Reposts: 9 · Likes: 0
Zico Kolter (@zicokolter) · 4 months ago
Arxiv: Website: Talk: Also with @zhilifeng @A_v_i__S @AlexRobey23 @m_finzi.
Replies: 1 · Reposts: 4 · Likes: 22
Zico Kolter (@zicokolter) · 4 months ago
Excited about this work with @ashertrockman @yashsavani_ (and others) on antidistillation sampling. It uses a nifty trick to efficiently generate samples that make student models _worse_ when you train on them. I spoke about it at Simons this past week. Links below.
[image]
Replies: 7 · Reposts: 19 · Likes: 161