Mattt
@mattt
Followers
34K
Following
3K
Media
18
Statuses
58
Collaborating w/@huggingface and writing on @nshipster. Prev: @replicate, @github, @apple, BA @CarnegieMellon
Portland, OR
Joined December 2006
I'm thrilled to announce my collaboration with @huggingface to help developers bring AI directly to users — on their own devices, on their own terms 🤗 We'll be working together to build tools & Swift packages that make things better, and to write guides that make things clearer.
13
16
184
🎉 llama.cpp now has Ollama-style model management.
• Auto-discover GGUFs from cache
• Load on first request
• Each model runs in its own process
• Route by `model` (OpenAI-compatible API)
• LRU unload at `--models-max`
https://t.co/yfmfHL7zzj
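If you're calling this from Swift, here's a minimal sketch of routing by `model` through the OpenAI-compatible endpoint. It assumes a local llama-server on its default port (8080) and uses a hypothetical model name standing in for whatever is in your GGUF cache:

```swift
import Foundation

// Sketch: select a model by name via the OpenAI-compatible API.
// The server loads it on first request and LRU-unloads past --models-max.
let url = URL(string: "http://localhost:8080/v1/chat/completions")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try JSONSerialization.data(withJSONObject: [
    "model": "qwen2.5-0.5b-instruct",  // hypothetical name from the GGUF cache
    "messages": [["role": "user", "content": "Hello!"]],
] as [String: Any])

let (data, _) = try await URLSession.shared.data(for: request)
print(String(decoding: data, as: UTF8.self))
```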
huggingface.co
16
57
415
In my collaboration with @huggingface, I'm on a mission to make it easier to build apps that leverage local, open-source AI models. So much of this has been community-driven, and I really appreciate the feedback I've received so far. What would you like to see next?
2
0
8
I built this package in response to feedback we received about the `HubApi` implementation in swift-transformers. Try swift-huggingface today, and stay tuned for its integration into swift-transformers in an upcoming release. https://t.co/v5yaB36NFf
github.com
A Swift client for Hugging Face Hub and Inference Providers APIs - huggingface/swift-huggingface
2
0
8
Introducing swift-huggingface: a complete Swift client for @huggingface Hub.
• Fast, resumable downloads
• Flexible, predictable auth
• Sharable cache with Python / hf CLI
• Xet support (coming soon!)
https://t.co/GyecsGnPAf
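A rough sketch of what downloading through the shared cache could look like. The module, type, and method names below are placeholders rather than the package's actual API; check the repo's README for the real surface:

```swift
import Foundation
import HuggingFace  // placeholder module name

// Hypothetical usage: pull one file from a Hub repo into the shared cache
// (the same layout the Python hf CLI uses), resuming if interrupted.
// HubClient and download(repo:filename:) are illustrative names only.
let client = HubClient(token: ProcessInfo.processInfo.environment["HF_TOKEN"])
let localURL = try await client.download(
    repo: "mlx-community/Qwen2.5-0.5B-Instruct-4bit",
    filename: "config.json"
)
print("Cached at \(localURL.path)")
```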
huggingface.co
6
52
394
Just published an article on @huggingface's blog about AnyLanguageModel that goes into more detail about the motivation behind the library and what we're building toward: https://t.co/ctRmsaT8b6 🆕 Be sure to check out our new chat-ui-swift example project (linked at the end)
huggingface.co
3
33
141
Forgot to mention — AnyLanguageModel now officially supports Linux, making it a great fit for anyone running web applications in Swift.
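If you want to try it server-side, here's a minimal manifest sketch. The package URL and version are assumptions; copy the real ones from the repository:

```swift
// swift-tools-version:5.10
// Sketch of a server-side Swift manifest pulling in AnyLanguageModel on Linux.
// The package URL and version below are assumptions, not canonical values.
import PackageDescription

let package = Package(
    name: "ChatService",
    dependencies: [
        .package(url: "https://github.com/mattt/AnyLanguageModel", from: "0.4.0"),
    ],
    targets: [
        .executableTarget(
            name: "ChatService",
            dependencies: [
                .product(name: "AnyLanguageModel", package: "AnyLanguageModel"),
            ]
        ),
    ]
)
```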
1
0
9
(And don't worry — I didn't forget about doc improvements for token authentication patterns. Look for that next week.)
0
0
0
Thanks to Noor Bhatia ( https://t.co/8474n0hzco) for requesting this feature and helping to implement support for MLX 🙌
github.com
noorbhatia has 21 repositories available. Follow their code on GitHub.
2
0
6
So far AnyLanguageModel has aimed for 1:1 compatibility with the Foundation Models framework API. However, since that doesn't (yet) support image inputs, we had to go off-book and design our own API. If and when the framework adds support, we'll migrate to that and provide a clear migration path.
1
0
4
AnyLanguageModel 0.4.0 is out with support for multi-modal inputs. Vision language models are incredibly useful — extract text from receipts, analyze diagrams, describe images for accessibility, answer questions about photos, and much more. You can now pass images directly to your prompts.
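Here's a sketch of what that can look like. Since the image API is our own design rather than Foundation Models', treat the parameter label and payload type below as illustrative, not the published signature:

```swift
import Foundation
import AnyLanguageModel

// Hypothetical sketch of the 0.4.0 image-input API. The `image:` label
// and raw-Data payload are assumptions, not the library's actual signature.
let session = LanguageModelSession(model: SystemLanguageModel.default)
let receipt = try Data(contentsOf: URL(fileURLWithPath: "receipt.png"))
let response = try await session.respond(
    to: "What's the total on this receipt?",
    image: receipt  // hypothetical parameter
)
print(response.content)
```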
Introducing AnyLanguageModel: A Swift package that provides a drop-in replacement for Apple's Foundation Models framework with support for custom language model providers. https://t.co/ZU8XvaVHwK Just change your import statement:
7
9
86
Thanks to everyone who's checked out AnyLanguageModel so far! I just cut a new 0.3.0 release with a Gemini adapter that supports Google search grounding, code evaluation, and more: https://t.co/v4A6EuORsN Next up — Documenting best practices for managing API credentials in your apps.
Introducing AnyLanguageModel: A Swift package that provides a drop-in replacement for Apple's Foundation Models framework with support for custom language model providers. https://t.co/ZU8XvaVHwK Just change your import statement:
9
11
147
I'm building AnyLanguageModel in collaboration with @huggingface to solve a major pain point for AI app developers: Most apps use some mix of local & remote models from different providers, and it's annoying to get them all to play nicely together. Apple's Foundation Models framework offers a clean API for this, so AnyLanguageModel adopts it and extends it to other providers.
3
2
36
AnyLanguageModel currently supports the following providers:
☑️ Apple Foundation Models
☑️ Core ML
☑️ MLX
☑️ llama.cpp (GGUF)
☑️ Ollama
☑️ OpenAI API
☑️ Anthropic API
Mix and match, tinker and trial — all with a single API.
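For example, swapping backends without touching the call site. The session API mirrors Foundation Models; the `OpenAILanguageModel` initializer shown here is illustrative, so check the package docs for the exact spelling:

```swift
import Foundation
import AnyLanguageModel

// Same session API, different backends.
let apiKey = ProcessInfo.processInfo.environment["OPENAI_API_KEY"] ?? ""

let onDevice = LanguageModelSession(model: SystemLanguageModel.default)
let hosted = LanguageModelSession(model: OpenAILanguageModel(apiKey: apiKey))  // illustrative initializer

let reply = try await hosted.respond(to: "Summarize this release note in one line.")
print(reply.content)
```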
1
1
28
Introducing AnyLanguageModel: A Swift package that provides a drop-in replacement for Apple's Foundation Models framework with support for custom language model providers. https://t.co/ZU8XvaVHwK Just change your import statement:
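The one-line change, plus the familiar session call. A minimal sketch, assuming the package's stated 1:1 compatibility with the Foundation Models API:

```swift
// Before:
// import FoundationModels

// After: same types, same calls, more providers
import AnyLanguageModel

let session = LanguageModelSession(model: SystemLanguageModel.default)
let response = try await session.respond(to: "Write a haiku about Swift.")
print(response.content)
```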
27
59
410
Also, we hit a fun milestone — https://t.co/pb1UMcV8mC now has a few thousand daily users just a month after launch. Thanks to everyone who's made it part of your development workflow! 🫶
0
0
11
By popular demand, I updated https://t.co/pb1UMcV8mC to support Apple's Human Interface Guidelines (HIG). https://t.co/QkmkEJvxNg Let me know how that works for you! I'm interested to hear what differences you see in, e.g. SwiftUI code generation when the HIG is in context.
sosumi.ai
sosumi.ai provides Apple Developer documentation in an AI-readable format by converting JavaScript-rendered pages into Markdown.
3
10
127
I’m curious to hear from y'all: What are the biggest problems you've had building AI-powered apps in Swift? What would you like to see us work on next?
13
1
23
.@huggingface has long been a champion of open-source machine learning. And their recent work on MLX, Core ML, and Swift Transformers has done a lot to advance AI on Apple platforms. So when the opportunity came up to collaborate with them, it was a dream come true.
2
2
46
It's a shame. So many use cases would work just as well — if not better — with an off-the-shelf model running on-device. But the tooling isn't there yet, docs are scattered, examples are toys, and the path to production is murky.
1
0
15