
Cohere Labs
@Cohere_Labs
22K Followers · 3K Following · 2K Media · 4K Statuses
@Cohere's research lab and open science initiative that seeks to solve complex machine learning problems. Join us in exploring the unknown, together.
Earth
Joined April 2019
This week we'll dive into the paper "REWARDBENCH 2: Advancing Reward Model Evaluation". Learn more and register now: Thanks to @asusevski and @alvanlii for organizing these events! 👏
RT @posi_olomo: Day 1. 1st class: ML Math Refresher. Speaker: Katrina Lawrence. We got a refresher on the following topics: - Derivatives - Par…
Don't forget to join us tomorrow, July 3rd, as we host @tonysilveti for a session on "Training neural networks at any scale". Learn more:
Join our ML Theory group next week as they welcome @tonysilveti on July 3rd for a presentation on "Training neural networks at any scale". Thanks to @itsmaddox_j @aniervs and @ThangChu77 for organizing this session 👏. Learn more:
RT @itsmaddox_j: An absolute privilege to be involved with all the fantastic individuals at the @Cohere_Labs community! This is your sign t….
🌟This month we are spotlighting @itsmaddox_j, one of our leads for the ML Theory Program! Andrej works as a research assistant with CaMLSys group at the University of Cambridge and focuses on compression/efficiency within large-scale distributed (federated) optimisation.
Our community-led Computer Vision group is looking forward to welcoming @Boyiliee, Research Scientist at NVIDIA, next week on July 8th for a session on "Scaling Vision Pre-Training to 4K Resolution". Thanks to @cataluna84 and @Arkhymadhe for organizing this event 👏
Don't forget! Tomorrow, July 2nd, we'll host @black_samorez for a session on "Quartet: Native FP4 training can be optimal for large language models". 📅 Learn more:
Join us on July 2nd as we welcome back @black_samorez for another insightful session! This time he'll present "Quartet: Native FP4 training can be optimal for large language models". Thanks to @Sree_Harsha_N, @BhavnickMinhas & @srishti_gureja for organizing this session 🤩
Scaling Self-Supervised Learning for Vision🧠🖼️. @TimDarcet will present on July 6th and he'll cover contrastive learning, masked image modeling, and how DINOv2 trains powerful models without labeled data. Learn more:
Intro to Transformers & LLMs! 🤖. On July 3rd, @SidYaeger will lead us as we explore attention, positional encoding, and model scaling—key ideas behind transformer models. A solid foundation for understanding how today’s LLMs are built. Learn more: