The Sagacious Society of Smol Model Enjoyers

@SmolModels

Followers 2,154
Following 3
Media 6
Statuses 40

smol is simple, speedy, safe, and scheap.

San Francisco, CA
Joined February 2023
Pinned Tweet
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
we are back thanks to @FanaHOVA! first project 📈
@swyx
swyx @ICLR_conf
1 year
🐣 Introducing `smol-developer`! ▸ Human-centric, coherent whole program synthesis ▸ your own junior developer ▸ develop, debug, decompile ▸ open source: ▸ 200 LOC, half english Insights: 💡 100k context can summarize both content and codebases 💡…
84
378
3K
1
7
26
@SmolModels
The Sagacious Society of Smol Model Enjoyers
6 months
AI Discord overwhelm? We gotchu. Coming to smol talk 🔜 (what are the top AI discords we should add? we have @openai @langchainai @nousresearch @Teknium1 @alignment_lab @latentspacepod )
Tweet media one
Tweet media two
Tweet media three
Tweet media four
5
4
27
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
Tiny Language Models (below 10M parameters, or only one transformer block) can generate paragraphs of coherent text and reason…provided training is limited to stories containing only words that typical 3 to 4-year-olds usually understand. Paper -
1
3
21
@SmolModels
The Sagacious Society of Smol Model Enjoyers
11 months
My dream model - 6 modalities - 300-500M total params - very aligned & instruction following - 27-33 heads - cute animal name - loves humans - rlly helpful but a bit naughty - rlly harmless & has guardrails - rlly honest - not from bigcorp (high quality volunteer) - very smol
1
3
11
@SmolModels
The Sagacious Society of Smol Model Enjoyers
10 months
Distilling Step-by-Step! Outperforming Larger Language Models with Less Training Data and Smaller Model Sizes
Tweet media one
0
0
5
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
0
1
5
@SmolModels
The Sagacious Society of Smol Model Enjoyers
10 months
Incredible story saving $500k a month by switching to finetuned BERT models: with 90% parity to ChatGPT and 15% of the latency!
Tweet media one
0
0
4
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
Neeva is shutting down Neeva dot com, and pivoting to smol models for the enterprise 👀
Tweet media one
@Neeva
Neeva
1 year
It is with heavy hearts we announce Neeva will shut down over the next few weeks. We appreciate our passionate community of customers & users that have supported us over the past few years. ❤️ We thank you for understanding. Here’s some more information ⤵️🧵
73
42
299
0
0
4
@SmolModels
The Sagacious Society of Smol Model Enjoyers
11 months
🐣 The BabyLM Challenge: matching LLMs with 0.01% the size
Tweet media one
Tweet media two
0
1
4
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
"big things ~~can~~ should always start small" 👏
@amasad
Amjad Masad
1 year
More than a year ago, Ghostwriter proof-of-concept took a few hours to prototype. Now it's a flagship product for Replit. This is how we move fast at Replit -- you can prototype entire features in the environment itself. In this case, the PoC worked by hosting a small OSS LLM…
Tweet media one
29
53
749
0
1
3
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
hello world
1
0
3
@SmolModels
The Sagacious Society of Smol Model Enjoyers
11 months
@n0riskn0r3ward @swyx thanks for the feedback! currently working on a rewrite that will hopefully address some of these issues 🐣
0
0
0
@SmolModels
The Sagacious Society of Smol Model Enjoyers
1 year
Tweet media one
Tweet media two
Tweet media three
Tweet media four
0
0
1
@SmolModels
The Sagacious Society of Smol Model Enjoyers
9 months
our GodMode rewrite is live!
@swyx
swyx @ICLR_conf
9 months
Q: What's better than Mixture of Experts? A: Mixture of Mixture of Experts! Introducing GodMode: the AI Chat Browser Fast, Free, Access to ChatGPT, Bing, Bard, Claude, YouChat, Poe, Perplexity, Phind, and Local/GGML Models like Vicuna and Alpaca No…
57
168
903
0
0
1