Colin Kealty
@bartowski1182
Followers 3K · Following 1K · Media 21 · Statuses 762
LLM Enthusiast https://t.co/FadJBzEsVw https://t.co/uJHolerL9G https://t.co/9JIEKgsIMh https://t.co/lYSGzQBmuP
Joined February 2024
@huggingface's most impactful contributors you should follow for the month of October is out! https://t.co/0k4322OZO8 Top 10 Model likes: @TheBlokeAI, #mradermacher, @bartowski1182, #lllyasviel, #hexgrad, #nitrosocke, @kijaidesign, @haruu1367, @prompthero,
Now in effect: Mergekit has been re-licensed under GNU LGPL v3, restoring clarity and flexibility for users and contributors. Read more about our decision in the blog. https://t.co/dGAMWK2V1r
arcee.ai
Effective Friday, October 31, 2025, we are returning Mergekit to the GNU Lesser General Public License v3.
ArceeAI is on Discord! Join for early access to some exciting drops!
It finally happened, one of my main fears with running my setup from home: my modem died during the night :') Thankfully tech support was very quick to accept it's defective hardware, but no stores open for another 3 hours so.. yeah, GLM 4.6 uploads will continue shortly
Small collab with the folks over at @LinusTech on an article :) https://t.co/vnnCputHkh They did most of the work, just helped with a few details, really great stuff from them!
lttlabs.com
Welcome to LTT Labs - your go-to destination for all things tech. Explore comprehensive test results, insightful commentary, and the latest analysis in hardware.
We're going permissive: Apache 2.0 across the board. AFM-4.5B is now relicensed from Arcee to Apache 2.0; the agent variant will launch under Apache 2.0; and all upcoming releases ship with open weights. Three models are in training.
Meet Jan-v1-edge: an experimental 1.7B distilled model for Perplexity-style search. Jan-v1-edge is our lightweight distillation experiment, derived from Jan v1. We're testing how well web search and reasoning can transfer into a smaller 1.7B parameter model that runs on edge
The next major question is.. Fedora or Ubuntu? I've been using Ubuntu my whole life, but largely because that's what I started with - no major issues, just curious if it's worth trying something else. I also plan to use this as an actual desktop for the first time in like a decade
Huge shout-out to @FrameworkPuter for both creating and sending me such an awesome piece of engineering, looking forward to playing with this :D ❤️
Wednesday night activities, setting up the new server
So proud of the team on this one, it was a really great effort and huge coordination to release an awesome model. Super excited about this and the future of @arcee_ai!
Today, we're officially releasing the weights for AFM-4.5B and AFM-4.5B-Base on HuggingFace. This is a major milestone for @arcee_ai. AFM is designed to be flexible and high-performing across a wide range of deployment environments.
Trying to talk myself into buying a new CPU/Mobo/RAM combo (epyc 9534+, 576-768GB).. why is RAM more than half the cost?! Looking at ~2500-3000 USD even on eBay
Gonna be chatting with @CloudDude_ and @_EldarKurtic about quantization and LLMs in general, going live in 20 minutes, come say hi :)
Join our live show! Thursday 17th 11:30 AM EDT. A chill livestream unpacking LLM #Quantization: @vllm_project vs @ollama, the What & How. Guest stars: @bartowski1182 & @_EldarKurtic. Stream on YouTube & Twitter:
As generative AI becomes increasingly central to business applications, the cost, complexity, and privacy concerns associated with language models are becoming significant. At @arcee_ai, we've been asking a critical question: Can CPUs actually handle the demands of language
Man, didn't think I'd feel the loss of /r/localllama this hard
hope it gets new moderation soon!!
this release is pure class. arcee using their data to do some short-term continued pretraining on GLM 32b. long context support has gone from effectively 8k -> 32k, and all base model evaluations (including short context ones) have improved
Last week, we launched AFM-4.5B, our first foundation model. In this post by @chargoddard, you will learn how we extended the context length of AFM-4.5B from 4k to 64k context through aggressive experimentation, model merging, distillation, and a concerning amount of soup. Bon
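The "soup" above nods to model soups, i.e. weight-space averaging of fine-tuned checkpoints that share an architecture. As a rough illustration only (this is not Arcee's actual recipe; the toy state dicts and the `average_checkpoints` helper are hypothetical), uniform souping is just a name-by-name mean over parameters:

```python
# Minimal sketch of uniform "model soup" averaging. Real soups average
# tensors from full state dicts; plain floats stand in for tensors here.

def average_checkpoints(checkpoints):
    """Average same-named parameters across a list of checkpoint dicts."""
    if not checkpoints:
        raise ValueError("need at least one checkpoint")
    names = checkpoints[0].keys()
    return {
        name: sum(ckpt[name] for ckpt in checkpoints) / len(checkpoints)
        for name in names
    }

# Three hypothetical fine-tuning runs of the same toy model:
runs = [
    {"w": 0.9, "b": 0.1},
    {"w": 1.1, "b": 0.3},
    {"w": 1.0, "b": 0.2},
]
soup = average_checkpoints(runs)
print(soup)  # soup["w"] == 1.0, soup["b"] ~ 0.2
```

Weighted or greedy variants (only adding a checkpoint to the soup if held-out evaluation improves) follow the same pattern with a selection loop on top.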
Today, we're thrilled to unveil the @arcee_ai Foundation Models, a new family of GenAI models designed from the ground up for enterprise reality. The first release, AFM-4.5B, is a 4.5-billion-parameter frontier model that delivers excellent accuracy, strict compliance, and very