Contextual AI

@ContextualAI

Followers: 2,272 · Following: 18 · Media: 12 · Statuses: 44

Enterprise LLMs

Bay Area
Joined March 2023
Pinned Tweet
@ContextualAI
Contextual AI
1 month
Today, we’re excited to announce RAG 2.0, our end-to-end system for developing production-grade AI. Using RAG 2.0, we’ve created Contextual Language Models (CLMs), which achieve state-of-the-art performance on a variety of industry benchmarks. CLMs outperform strong RAG…
[image] · 35 · 140 · 1K
@ContextualAI
Contextual AI
2 months
Introducing GRIT: one single language model that is instruction-tuned to be good at both representation and generation, at the same time. GritLM is open state of the art on *both* generative and embedding benchmarks + big speedups in RAG. Read more here:
[image] · 3 · 9 · 46
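The idea in the tweet above, one set of weights acting as both embedder and generator, can be sketched in a few lines. The snippet below is an illustrative sketch only, not the GritLM API: MODEL_NAME is a placeholder checkpoint, and mean-pooling causal hidden states is a simplification of how GritLM actually computes embeddings.

```python
# Illustrative sketch of the "one model, two tasks" idea: the same causal LM
# yields embeddings (pooled hidden states) and generations. This is NOT the
# GritLM API; MODEL_NAME is a placeholder, and mean-pooling causal states is
# a simplification of how GritLM actually embeds text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "your-causal-lm-checkpoint"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

def embed(texts):
    """Mean-pool the last hidden layer to get one vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch, output_hidden_states=True)
    hidden = out.hidden_states[-1]                  # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)    # (batch, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1).clamp(min=1)

def generate(prompt, max_new_tokens=64):
    """Ordinary autoregressive generation from the same weights."""
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(ids[0], skip_special_tokens=True)

# One model both retrieves (via embeddings) and answers (via generation).
docs = ["GRIT unifies embedding and generation.", "Unrelated text."]
scores = torch.nn.functional.cosine_similarity(
    embed(["What does GRIT unify?"]), embed(docs))
best = docs[int(scores.argmax())]
print(generate(f"Context: {best}\nQuestion: What does GRIT unify?\nAnswer:"))
```

Because the same weights handle both steps, a retrieval-augmented setup built this way only has to host and update one model rather than a separate embedder and generator.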
@ContextualAI
Contextual AI
1 month
With RAG 2.0, the generator and retriever are always working together. Whether you're building a house or enterprise-grade AI, teamwork makes the dream work
[image] · 2 · 7 · 39
@ContextualAI
Contextual AI
6 months
Correct licensing and attribution are critical when building LLMs for enterprise customers. Here at Contextual we care a lot about these issues and are glad to share our findings with the community, via research like this:
@Muennighoff
Niklas Muennighoff
6 months
We analyze 1800+ datasets across licenses & metadata to help navigate the increasing complexity of instruction data ☸⛴️ CommitPackFT & xP3x are the only instruction datasets with >>50 langs & very permissive⭐ 📜 A huge effort led by @ShayneRedford 🚀❤️
[image] · 1 · 28 · 106
0 · 7 · 37
@ContextualAI
Contextual AI
5 months
Congratulations to our very own @Muennighoff and the other awardees for being #NeurIPS2023 outstanding paper runners-up with their work on scaling data-constrained language models!
@NeurIPSConf
NeurIPS Conference
5 months
**Outstanding Main Track Runner-Ups**: "Scaling Data-Constrained Language Models" and "Direct Preference Optimization: Your Language Model is Secretly a Reward Model"
1 · 11 · 76
0 · 7 · 27
@ContextualAI
Contextual AI
1 month
A typical RAG system today stitches together a frozen off-the-shelf model for embeddings, vector database for retrieval, and a black-box language model for generation. The components technically work, but the whole is suboptimal — and far from production-grade. RAG 2.0…
2 · 1 · 26
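For contrast, here is a toy sketch of the "stitched together" baseline the tweet above describes: a frozen embedder, a separate vector index, and a black-box generator that only meet at inference time. Every component below is a stand-in (placeholder names, random embeddings), not any particular vendor's stack.

```python
# Toy sketch of the "stitched together" baseline described above: a frozen
# embedder, a separate vector index, and a black-box generator that only meet
# at inference time. Every component here is a stand-in, not a real vendor API.
import numpy as np

def frozen_embed(text: str) -> np.ndarray:
    """Stand-in for a frozen off-the-shelf embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))  # toy determinism
    return rng.standard_normal(384)

class VectorStore:
    """Stand-in for an external vector database."""
    def __init__(self):
        self.vectors, self.docs = [], []

    def add(self, doc: str):
        self.vectors.append(frozen_embed(doc))
        self.docs.append(doc)

    def search(self, query: str, k: int = 2):
        q = frozen_embed(query)
        sims = [float(q @ v) / (np.linalg.norm(q) * np.linalg.norm(v)) for v in self.vectors]
        top = np.argsort(sims)[::-1][:k]
        return [self.docs[i] for i in top]

def blackbox_llm(prompt: str) -> str:
    """Stand-in for a hosted, black-box generator."""
    return f"<answer conditioned on: {prompt[:60]}...>"

# The three pieces are never trained together; nothing is optimized jointly.
store = VectorStore()
for doc in ["Policy A covers water damage.", "Policy B covers fire damage."]:
    store.add(doc)
context = "\n".join(store.search("Does my policy cover fire?"))
print(blackbox_llm(f"Context:\n{context}\n\nQuestion: Does my policy cover fire?"))
```

Nothing in this pipeline is optimized jointly, which is the gap the RAG 2.0 tweets here are pointing at.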
@ContextualAI
Contextual AI
1 month
Our first set of RAG 2.0 models, Contextual Language Models (CLMs), significantly improve performance over current systems across axes critical for enterprise work: open-domain question answering, faithfulness, and freshness.
[image] · 1 · 2 · 26
@ContextualAI
Contextual AI
1 month
We're at #GTC24! Join us on March 20th to hear @apsdehal share how we developed Kahneman Tversky Optimization (KTO) to speed up the #LLM and human feedback loop for #generativeai
[image] · 0 · 7 · 24
@ContextualAI
Contextual AI
1 month
Our CEO @douwekiela spoke at @saastr's AI Day yesterday - want to know what it takes to build AI products for the enterprise? Watch the recording here:
1 · 3 · 23
@ContextualAI
Contextual AI
8 months
It was great to see Contextual AI in @ThomasOrTK's keynote at #GoogleCloudNext. We’re excited to partner and build the next generation of language models.
[image] · 1 · 7 · 21
@ContextualAI
Contextual AI
5 months
We're excited to share our work on better, faster and cheaper alignment of large language models with the broader community:
@winniethexu
Winnie Xu
5 months
At Contextual AI, one of the biggest pain points for our customers doing LLM alignment is getting the preference data that current methods need. Think about all the pain you’ve been through trying to collect + label training data — now imagine doing that at 100x the scale. 🧵 1/
[image] · 5 · 22 · 135
1 · 3 · 21
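The pain point in the quoted thread is the shape of the training data. As a rough illustration (the records below are made up), preference-based methods need paired "chosen vs. rejected" completions for each prompt, while the KTO-style approach mentioned in the #GTC24 tweet above only needs a per-example thumbs-up or thumbs-down:

```python
# Made-up records illustrating the data-collection gap discussed above.
# Preference-pair format (what pairwise methods like DPO expect): every prompt
# needs TWO completions plus a human judgment of which one is better.
preference_pairs = [
    {"prompt": "Summarize our refund policy.",
     "chosen": "Refunds are available within 30 days with a receipt.",
     "rejected": "We never give refunds."},
]

# Binary-feedback format (the KTO-style setup referenced above): one completion
# with a thumbs-up / thumbs-down label, the kind of signal products already
# collect from real users at scale.
binary_feedback = [
    {"prompt": "Summarize our refund policy.",
     "completion": "Refunds are available within 30 days with a receipt.",
     "label": True},
    {"prompt": "Summarize our refund policy.",
     "completion": "We never give refunds.",
     "label": False},
]
```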
@ContextualAI
Contextual AI
6 months
🚀 Exciting Job Opportunities! We're searching for our next Product Engineers and Solutions Engineers to be at the forefront of AI. We're on a mission to create groundbreaking AI solutions, and we want you to be a key part of our journey. Apply here:
1 · 7 · 17
@ContextualAI
Contextual AI
8 months
Welcome to the team @StasBekman!
@StasBekman
Stas Bekman
8 months
I'm super excited to start working at @contextualai where I will be training LLMs w/ Retrieval to help businesses deploy AI that overcomes hallucination, keeps data up-to-date and runs much faster inference. If you're new to , see: …
5 · 6 · 127
1 · 2 · 19
@ContextualAI
Contextual AI
2 months
Contextual AI leverages @googlecloud GKE Autopilot for our retrieval augmented language model technology, optimized for enterprise workflows. Discover how #GKE streamlines operations, enhances performance, and reduces costs for AI applications:
0 · 5 · 18
@ContextualAI
Contextual AI
8 months
We’re building the next generation of language models for enterprise customers. Announcing our partnership with Google Cloud to build our Contextual Language Models (CLMs):
[image] · 0 · 5 · 15
@ContextualAI
Contextual AI
8 months
And we got to hit a gong, as loud as possible!
0 · 2 · 15
@ContextualAI
Contextual AI
10 months
Announcing LENS 🔎, a framework for vision-augmented language models, making language models see. Read more: Demo:
[image]
@w33lliam
William Berrios
10 months
Announcing LENS 🔎, a framework for vision-augmented language models. - Outperforms Flamingo by 9% (56->65%) on VQAv2 - Eliminates the additional cost of multimodal pre-training Demo: Blog+Paper+Code: A 🧵 [1/N]
2 · 42 · 192
1 · 1 · 16
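The LENS recipe described in the quoted thread can be sketched roughly as follows: frozen vision modules translate the image into text (a caption plus tags), and a frozen text-only LLM answers over that text, which is why no multimodal pretraining is needed. This is a sketch of the general idea, not the released LENS code; the checkpoint names are examples of off-the-shelf models, "your-frozen-llm" is a placeholder, and the candidate tag list is invented.

```python
# Rough sketch of the LENS recipe (not the released LENS code): frozen vision
# modules turn the image into text, and a frozen text-only LLM answers over
# that text. Checkpoint names are examples of off-the-shelf models;
# "your-frozen-llm" is a placeholder and the candidate tag list is invented.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
tagger = pipeline("zero-shot-image-classification", model="openai/clip-vit-base-patch32")
llm = pipeline("text-generation", model="your-frozen-llm")  # placeholder checkpoint

def answer(image_path: str, question: str) -> str:
    caption = captioner(image_path)[0]["generated_text"]
    candidate_tags = ["dog", "cat", "car", "kitchen", "beach"]  # illustrative only
    tags = [r["label"] for r in tagger(image_path, candidate_labels=candidate_tags)[:3]]
    # The LLM never sees pixels, only the text the vision modules produced.
    prompt = (f"Image caption: {caption}\n"
              f"Image tags: {', '.join(tags)}\n"
              f"Question: {question}\nAnswer:")
    return llm(prompt, max_new_tokens=32)[0]["generated_text"]
```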
@ContextualAI
Contextual AI
7 months
We're proud to be a 2023 #IA40 Intelligent Applications Rising Star winner!
@MadronaVentures
Madrona
7 months
We are excited to unveil the 2023 #IA40: the top private companies building & enabling intelligent & generative apps today. We'll celebrate the winners on Oct. 11 at the #IASummit in partnership w/ @Microsoft, @AWSstartups, @NYSE, @McKinsey, & @PitchBook!
1 · 9 · 22
0 · 7 · 15
@ContextualAI
Contextual AI
3 months
Looking to get the most out of your hardware when training models? Take a look at this new report from our very own @StasBekman.
@StasBekman
Stas Bekman
3 months
This is a long-overdue paper that we started discussing back when training BLOOM-176B. Basically, this paper tells you how to design your model's dimensions for optimal training throughput. Fantastic! Yours truly contributed the SwiGLU section ;)
2 · 15 · 146
0 · 6 · 15
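As a flavor of the kind of sizing rule the quoted report covers: SwiGLU feed-forward blocks use three weight matrices instead of two, so a parameter-matched hidden size is roughly (8/3) × d_model, and rounding to a hardware-friendly multiple tends to help throughput. The multiple of 128 below is an illustrative choice, not a number taken from the report.

```python
# Arithmetic sketch of a SwiGLU-aware sizing rule: SwiGLU FFNs have three
# weight matrices instead of two, so a parameter-matched hidden size is
# roughly (8/3) * d_model; rounding to a hardware-friendly multiple (128 is
# an illustrative choice, not a number from the report) helps throughput.
def swiglu_ffn_size(d_model: int, multiple_of: int = 128) -> int:
    target = 8 * d_model // 3                         # parameter-matched size
    return multiple_of * round(target / multiple_of)  # align for the hardware

for d in (2048, 4096, 8192):
    h = swiglu_ffn_size(d)
    # SwiGLU FFN params (3 * d * h) vs classic 4*d FFN params (8 * d^2)
    print(d, h, 3 * d * h, 8 * d * d)
```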
@ContextualAI
Contextual AI
2 months
Selecting the right data is critical for LLM performance across all stages of training. This recent paper surveys data selection, and shows that there is much more exciting research to be done.
@AlbalakAlon
Alon Albalak
2 months
{UCSB|AI2|UW|Stanford|MIT|UofT|Vector|Contextual AI} present a survey on🔎Data Selection for LLMs🔍 Training data is a closely guarded secret in industry🤫with this work we narrow the knowledge gap, advocating for open, responsible, collaborative progress
[image] · 9 · 77 · 305
0 · 5 · 15
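To make "data selection" concrete, here is one tiny example of the heuristic-filter family that surveys like this cover: dropping very short documents, documents with little alphabetic content, and exact duplicates. The thresholds are invented for illustration; real pipelines combine many such signals with model-based scoring.

```python
# Tiny example of the heuristic-filter family covered by such surveys: drop
# very short documents, documents with little alphabetic content, and exact
# duplicates. Thresholds are invented for illustration.
def select(documents, min_words=20, min_alpha_ratio=0.6):
    seen, kept = set(), []
    for doc in documents:
        alpha_ratio = sum(c.isalpha() for c in doc) / max(len(doc), 1)
        if len(doc.split()) >= min_words and alpha_ratio >= min_alpha_ratio and doc not in seen:
            seen.add(doc)
            kept.append(doc)
    return kept

corpus = ["short junk 123", "A longer, mostly alphabetic document " * 5]
print(len(select(corpus)), "of", len(corpus), "documents kept")
```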
@ContextualAI
Contextual AI
10 months
We're thrilled to be featured on the @CBinsights #AI100 as one of the "most promising AI startups" in the "Foundation models & APIs" category! 🚀
@CBinsights
CB Insights
10 months
Boom: The most promising AI companies around the world. This year's winners include startups working on generative AI infrastructure, emotion analytics, general-purpose humanoids, and more. #AI100
4 · 27 · 69
0 · 7 · 13
@ContextualAI
Contextual AI
1 month
CLMs achieve even bigger gains over current approaches when applied to real world data, as we have seen with our early customers. We see this in finance (see FinanceBench below as proxy), as well as in other highly specialized domains like law and hardware engineering.
[image] · 1 · 1 · 14
@ContextualAI
Contextual AI
1 month
We’re thrilled about the results we’re already seeing with RAG 2.0 and can’t wait to bring it to more leading enterprises. F500s and unicorns are already building on RAG 2.0 today, leveraging our CLMs and latest fine-tuning and alignment techniques to deploy generative AI they…
0 · 1 · 12
@ContextualAI
Contextual AI
28 days
Join us on April 10 as we take part in #GoogleCloudNext . Our CEO @douwekiela will dive into retrieval-augmented generation (RAG), which he pioneered at Facebook, and share how Contextual AI's RAG 2.0 approach is key to #generativeAI deployment in the #enterprise . Register to join…
[image] · 1 · 3 · 13
@ContextualAI
Contextual AI
26 days
We have a new way to build #enterpriseai with RAG 2.0. Our CTO @apsdehal will be sharing how we accelerate #AI training workloads with @GoogleCloudNext tech. Join the discussion on April 10 →
[image] · 0 · 2 · 11
@ContextualAI
Contextual AI
1 month
Find out how to make the dream work at
0 · 0 · 7
@ContextualAI
Contextual AI
2 months
It’s all open source! The paper () provides detailed ablations. Effort led by the amazing @Muennighoff, in collaboration with @Microsoft @MSFTResearch and @HKUniversity.
@Muennighoff
Niklas Muennighoff
2 months
Introducing GRIT🦾to unify text embedding 🔢& generation 📝. GritLM is open SoTA on embedding (MTEB) & generative tasks (BBH etc.) – Both in 1 model. See 🧵for how GRIT🦾 makes RAG >60% faster & more 📜 💻 1/12
[image] · 10 · 141 · 575
0 · 0 · 5