Arcee.ai (@arcee_ai)

Followers: 4K · Following: 2K · Media: 283 · Statuses: 655

Optimize cost & performance with AI platforms powered by our industry-leading SLMs: Arcee Conductor for model routing, and Arcee Orchestra for agentic workflows.

San Francisco
Joined September 2023
Arcee.ai (@arcee_ai) · 11 days
We knew it all along, but it's great to get user validation: through real-user usage and evaluation, several of our small models—Maestro, Coder, and AFM-4.5B-Preview—are topping the charts on @yupp_ai, the only platform that determines AI quality through real-user pairwise…
Arcee.ai (@arcee_ai) · 20 days
As generative AI becomes increasingly central to business applications, the cost, complexity, and privacy concerns associated with language models are becoming significant. At @arcee_ai, we’ve been asking a critical question: Can CPUs actually handle the demands of language…
Arcee.ai (@arcee_ai) · 25 days
RT @julsimon: In this fun demonstration, you can witness the impressive capabilities of @arcee_ai AFM-4.5B-Preview, Arcee's first foundatio…
Arcee.ai (@arcee_ai) · 25 days
RT @julsimon: In this new video, I introduce two new research-oriented models that @arcee_ai recently released on @huggingface. Homu…
Arcee.ai (@arcee_ai) · 25 days
RT @julsimon: In this video, I introduce and demonstrate three production-grade models that @arcee_ai recently opened and released on @hu…
Arcee.ai (@arcee_ai) · 26 days
Save the date! A week from now, please join us live to discover how @Zerve_AI is leveraging our model routing solution, Arcee Conductor, to improve its agentic platform for data science workflows. This should be a super interesting discussion, and of course, we'll do demos!
Arcee.ai (@arcee_ai) · 29 days
Today, we're happy to announce the open-weights release of five language models, including three enterprise-grade production models that have been powering customer workloads through our SaaS platform (SuperNova, Virtuoso-Large, Caller), and two cutting-edge research models…
Arcee.ai (@arcee_ai) · 1 month
Today, we're excited to announce the integration of @arcee_ai Conductor, our SLM/LLM model routing solution, into the @Zerve_AI platform, an agent-driven operating system for Data & AI teams 😃. This collaboration enables data scientists, engineers, and AI developers to build…
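The posts above describe Conductor's core idea: send each prompt to a small language model when it can handle it, and escalate to a large model only when needed. Conductor's actual interface and routing logic are not shown in this feed, so the sketch below is purely illustrative — the function names, keyword heuristic, and threshold are all hypothetical stand-ins for what would really be a trained router.

```python
# Illustrative sketch of SLM/LLM routing in the spirit of what the posts
# describe. All names and thresholds here are hypothetical; this is NOT
# Arcee Conductor's API.

def estimate_complexity(prompt: str) -> float:
    """Crude proxy for prompt difficulty: longer or reasoning-heavy
    prompts score higher. A real router would use a trained classifier."""
    score = min(len(prompt.split()) / 200, 1.0)
    if any(k in prompt.lower() for k in ("prove", "derive", "multi-step", "plan")):
        score = max(score, 0.8)
    return score

def route(prompt: str, threshold: float = 0.5) -> str:
    """Send easy prompts to a small model, hard ones to a large model."""
    return "slm" if estimate_complexity(prompt) < threshold else "llm"

print(route("What is 2 + 2?"))  # short factual prompt -> "slm"
print(route("Derive the gradient of the loss and plan a multi-step proof"))  # -> "llm"
```

The cost saving comes from the asymmetry: if most traffic is routable to the cheap model, average per-request cost drops sharply while hard requests still get full-sized model quality.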
Arcee.ai (@arcee_ai) · 1 month
We’re beyond thrilled to share that Arcee AI Conductor has been named “LLM Application of the Year” at the 2025 AI Breakthrough Awards. This recognition isn’t just a shiny badge—it’s a celebration of a vision we’ve been chasing for years: making AI smarter, more accessible, and…
Arcee.ai (@arcee_ai) · 1 month
RT @gm8xx8: “First of many blogs” from Arcee. AFM-4.5B scaled from 4K → 64K context. ⮕ JUST MERGE, DISTILL, REPEA…
Arcee.ai (@arcee_ai) · 1 month
In this post, Mariam Jabara, one of our Field Engineers, walks you through three real-life use cases for model merging, recently published in research papers:
➡️ Model Merging in Pre-training of Large Language Models
➡️ PatientDx: Merging Large Language Models for Protecting…
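Several posts in this feed lean on model merging ("a concerning amount of soup"). At its simplest, merging is a weighted average of checkpoints with matching parameter names. Real merges (e.g. with tooling like mergekit) operate on full tensors and offer many methods beyond linear averaging; the toy below uses plain dicts of floats, which is just enough to show the arithmetic. The function name and dict layout are illustrative, not any particular library's API.

```python
# Minimal sketch of linear model merging ("model soup"): a weighted
# average of parameter sets that share the same keys. Toy scalars stand
# in for real weight tensors.

def merge_linear(models, weights=None):
    """Weighted average of parameter dicts with identical keys."""
    if weights is None:
        weights = [1.0 / len(models)] * len(models)  # uniform soup
    assert abs(sum(weights) - 1.0) < 1e-9, "mixing weights should sum to 1"
    merged = {}
    for key in models[0]:
        merged[key] = sum(w * m[key] for w, m in zip(weights, models))
    return merged

m1 = {"layer.w": 1.0, "layer.b": 0.0}
m2 = {"layer.w": 3.0, "layer.b": 2.0}
print(merge_linear([m1, m2]))                 # {'layer.w': 2.0, 'layer.b': 1.0}
print(merge_linear([m1, m2], [0.75, 0.25]))   # {'layer.w': 1.5, 'layer.b': 0.5}
```

Uniform averaging treats every parent model equally; skewed weights let one parent dominate while still pulling in behavior from the others, which is the knob the merging literature above explores.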
Arcee.ai (@arcee_ai) · 1 month
RT @sjoshi804: Congratulations to the @datologyai team on powering the data for AFM-4B by @arcee_ai - competitive with Qwen3 - using way wa…
Arcee.ai (@arcee_ai) · 1 month
RT @arcee_ai: Last week, we launched AFM-4.5B, our first foundation model. In this post by @chargoddard, you will learn how we extended t…
Arcee.ai (@arcee_ai) · 1 month
RT @kalomaze: this release is pure class. arcee using their data to do some short-term continued pretraining on GLM 32b. long context suppo…
Arcee.ai (@arcee_ai) · 1 month
RT @teortaxesTex: You can't overstate how impressive this is. Arcee took one of the strongest base models, GLM-4, a product of many years o…
Arcee.ai (@arcee_ai) · 1 month
Last week, we launched AFM-4.5B, our first foundation model. In this post by @chargoddard, you will learn how we extended the context length of AFM-4.5B from 4k to 64k through aggressive experimentation, model merging, distillation, and a concerning amount of soup. Bon…
Arcee.ai (@arcee_ai) · 1 month
RT @leavittron: "Pretraining is dead" is dead
Arcee.ai (@arcee_ai) · 1 month
RT @arcee_ai: Our first foundation model, AFM-4.5B, is not even 24 hours old, and our users are already going wild. "Don't sleep on Arcee"…
Arcee.ai (@arcee_ai) · 1 month
Our first foundation model, AFM-4.5B, is not even 24 hours old, and our users are already going wild. "Don't sleep on Arcee" seems to be the motto. We love that, because we haven't slept much lately 😃. You can try the model in our playground and on…