David Cox

@neurobongo

Followers 12K · Following 39K · Media 2K · Statuses 22K

VP, AI Models @IBMResearch, IBM Director, @MITIBMLab. Former prof and serial/parallel entrepreneur.

Cambridge, MA
Joined March 2011
@nexa_ai
NEXA AI
1 month
For the first time, the latest LLMs run on the @Apple Neural Engine — and NexaSDK is the only framework that makes it possible, powered by the NexaML engine. Last year, our two co-founders were invited by @Apple DMLI team (Data & Machine Learning Innovation) to share their
3
4
20
@sach1n
Sachin Desai
2 months
This is Granite 4.0 Nano running on an iPhone 17 Pro with MLX Swift. This is the 350M parameter model and the 1B runs equally well. @IBMDeveloper
1
4
14
@Marktechpost
Marktechpost AI Dev News ⚡
2 months
IBM AI Team Releases Granite 4.0 Nano Series: Compact and Open-Source Small Models Built for AI at the Edge Small models are often blocked by poor instruction tuning, weak tool use formats, and missing governance. IBM AI team released Granite 4.0 Nano, a small model family that
0
1
1
@qubitium
Qubitium
2 months
🏔️Granite 4 Nano quantization support has been added to GPT-QModel! Both H-1B and H-350M W4A16 quantized models now available on HF. 🤗 Dl link in comments. Our eval scores also validate they are indeed tier-one small models. 👇 @neurobongo
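The W4A16 scheme mentioned above keeps activations in 16-bit and stores weights as 4-bit integers with per-group scales. A minimal NumPy sketch of the group-wise asymmetric quantize/dequantize round-trip (group size and value ranges are illustrative assumptions, not GPT-QModel's actual kernel):

```python
import numpy as np

def quantize_w4(w, group_size=128):
    """Group-wise asymmetric 4-bit quantization of a flat weight vector.
    Each group of weights gets its own scale and zero-point (W4A16-style)."""
    w = w.reshape(-1, group_size)
    wmin = w.min(axis=1, keepdims=True)
    wmax = w.max(axis=1, keepdims=True)
    scale = (wmax - wmin) / 15.0                   # 4 bits -> 16 levels (0..15)
    scale = np.where(scale == 0, 1.0, scale)       # guard against flat groups
    zero = np.round(-wmin / scale)
    q = np.clip(np.round(w / scale + zero), 0, 15).astype(np.uint8)
    return q, scale, zero

def dequantize_w4(q, scale, zero):
    """Reconstruct approximate fp32 weights from 4-bit codes."""
    return (q.astype(np.float32) - zero) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=(4096,)).astype(np.float32)
q, s, z = quantize_w4(w)
w_hat = dequantize_w4(q, s, z).reshape(-1)
err = np.abs(w - w_hat).max()   # worst-case per-weight error, bounded by ~one scale step
```

The per-weight error stays below roughly one quantization step (the group's scale), which is why 4-bit models can hold their eval scores close to the fp16 originals.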
@IBMDeveloper
IBM Developer
2 months
Introducing Granite 4.0 Nano, compact and open-source models built for AI at the edge. Available in 350M and 1B, for building AI on laptops and mobile devices: https://t.co/bUMyuLJvsb
1
1
1
@xenovacom
Xenova
2 months
IBM just released Granite-4.0 Nano, their smallest LLMs ever (300M & 1B)! 😍 The models demonstrate remarkable instruction following and tool calling capabilities, and can even run locally in-browser! This means they can interact with websites and call browser APIs for you! 🤯
8
79
518
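The "tool calling" capability praised in these posts reduces to the model emitting a structured function call against a declared schema, which the application then parses and executes. A hedged sketch in the JSON-schema style most open chat models accept (the tool spec and model output below are fabricated for illustration, not actual Granite output):

```python
import json

# Hypothetical tool declaration in the common JSON-schema "function" format.
get_weather = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# A tool-capable model returns a structured call rather than prose;
# the host application parses it and dispatches to the real function.
model_output = '{"name": "get_weather", "arguments": {"city": "Cambridge"}}'
call = json.loads(model_output)
```

In-browser, the same loop lets a small model drive website actions: the page exposes its APIs as tool schemas and executes whatever structured calls the model emits.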
@neurobongo
David Cox
2 months
We just dropped the "nano" versions of our Granite 4 language models (1B and 350M sizes, in both vanilla and hybrid SSM versions). I will let the Pareto curve speak for itself (full benchmark details in the link below). Try them out and let us know what you think.
2
3
7
@adrgrondin
Adrien Grondin
3 months
Granite 4.0 H Micro and Tiny models are now available in @LocallyAIApp Micro is available for all devices Tiny is available for iPhone 17 Pro, iPhone Air and iPads
@adrgrondin
Adrien Grondin
3 months
Granite 4.0 H Tiny (4-bit) by @IBM running on iPhone 17 Pro at ~40 tk/s with MLX. 7B total parameters with 1B active, using less than 5GB of RAM; extremely good in benchmarks for its memory footprint. IBM did a great job with this one, it's fast and efficient for the size
7
2
59
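The "under 5GB of RAM" figure for a 7B-total model is easy to sanity-check: even though only 1B parameters are active per token, all 7B must be resident, at 4 bits each plus per-group scale overhead. A rough estimate (group size and fp16-scale overhead are assumptions; KV cache and runtime overhead are ignored):

```python
def quantized_weight_gib(n_params, bits=4, group_size=64, scale_bytes=2):
    """Approximate resident weight size for a group-quantized model.
    Assumes one fp16 scale per group of weights; ignores KV cache."""
    weight_bytes = n_params * bits / 8
    overhead_bytes = (n_params / group_size) * scale_bytes
    return (weight_bytes + overhead_bytes) / 2**30

# Granite 4.0 H Tiny: 7B total parameters (1B active per token)
tiny_h = quantized_weight_gib(7e9)   # ~3.5 GiB of weights
```

That lands around 3.5 GiB, leaving headroom under 5GB for the KV cache and runtime, consistent with the post above.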
@lmstudio
LM Studio
3 months
Granite 4.0 from @IBM, family of new models: • Micro: transformer, dense 3B • Micro H: hybrid, dense 3B • Tiny H: hybrid, MoE 7B/1B active • Small H: hybrid, MoE 32B/9B active Apache 2.0. Trained for tool calling and RAG. Requires LM Studio 0.3.28! https://t.co/5j0rXVBGTI
lmstudio.ai
Granite 4.0 is a hybrid MoE model from IBM, trained for tool use and RAG use cases.
17
49
546
@kaggle
Kaggle
3 months
🚀 New Model Launch: Granite 4.0 @IBM's Granite 4.0 is the latest open-source SLM, built for fast inference, long-context understanding (tested on up to 128K tokens), and cost-efficient deployments. Multiple model sizes provide flexibility for different hardware and use cases.
7
14
122
@UnslothAI
Unsloth AI
3 months
IBM releases Granite-4.0, their new series of open models! Run the 'Micro' 3B model on 4GB RAM or 'Small' 32B on 40GB RAM. Granite-4.0 excels at agentic tasks, doc analysis, RAG, edge AI applications & more! Dynamic GGUFs: https://t.co/uK9KoaqLbw Guide: https://t.co/q9TW0k7hMd
21
125
777
@xenovacom
Xenova
3 months
IBM just released Granite 4.0, their latest series of small language models! These models excel at agentic workflows (tool calling), document analysis, RAG, and more. 🚀 The "Micro" (3.4B) model can even run 100% locally in your browser on WebGPU, powered by 🤗 Transformers.js!
21
153
1K
@ClementDelangue
clem 🤗
3 months
IBM is back! They just joined Hugging Face Enterprise & released Granite 4.0 in open-source with a new hybrid Mamba/transformer architecture that reduces memory requirements without reducing accuracy much. This set of models is great for agentic workflows like tool calling,
27
116
781
@Dorialexander
Alexander Doria
3 months
ibm suiting up again after llama 4 fumbled.
@Dorialexander
Alexander Doria
3 months
we finally have western qwen and you won't ever believe who this is.
6
7
138
@Tu7uruu
steven
7 months
New top ASR model on the Open ASR Leaderboard! We've just added IBM Granite-Speech-3.3-8b and 2b to the leaderboard! Open weights and top scores across benchmarks.
3
18
119
@Yikang_Shen
Yikang Shen
1 year
Stick-Breaking Attention: Out-of-box length extrapolation, thanks to removing the position embedding; Better performance than Softmax+RoPE on almost every task; Similar efficient implementation like Flash Attention. Do we still need Softmax+RoPE for Language Models?
9
27
170
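The stick-breaking mechanism in the post above replaces softmax: each query allocates attention mass from the nearest past position outward, taking a sigmoid-gated fraction of whatever "stick" remains, so no position embedding is needed and weights sum to at most 1. A minimal single-head NumPy sketch of that recipe (a readable reference loop, not the Flash-Attention-style kernel the paper describes):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stick_breaking_attention(q, k, v):
    """Causal stick-breaking attention, single head.
    beta[i, j] = sigmoid(q_i . k_j / sqrt(d)); position j < i receives
    beta[i, j] times the stick remaining after all closer positions,
    so row weights are nonnegative and sum to at most 1 (no softmax)."""
    T, d = q.shape
    beta = sigmoid(q @ k.T / np.sqrt(d))
    weights = np.zeros((T, T))
    out = np.zeros_like(v)
    for i in range(T):
        remaining = 1.0                       # length of stick still unallocated
        for j in range(i - 1, -1, -1):        # nearest past position first
            weights[i, j] = beta[i, j] * remaining
            remaining *= 1.0 - beta[i, j]
        out[i] = weights[i] @ v
    return out, weights

rng = np.random.default_rng(0)
T, d = 6, 8
q, k, v = (rng.normal(size=(T, d)) for _ in range(3))
out, w = stick_breaking_attention(q, k, v)
```

Because the allocation depends only on the order in which past positions are consumed, the mechanism extrapolates to sequence lengths unseen in training, which is the out-of-box length extrapolation the post highlights.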
@ailozovskaya
Alina Lozovskaya
1 year
🚀 Introducing IBM Granite 3.0 models on the 🤗 Open LLM Leaderboard! Granite 3.0 is IBM’s newest suite of lightweight, multilingual, and versatile open foundation models designed for enterprise and customization. These models excel in coding, reasoning, and tool usage,
2
10
24
@replicate
Replicate
1 year
We've partnered with IBM to bring the new Granite 3.0 language models to Replicate. These models are trained on license-permissible data collected following IBM’s AI Ethics principles for trustworthy enterprise usage. Best of all, they're fully open-source and Apache 2.0
2
8
42
@danielnewmanUV
Daniel Newman
1 year
I had the chance to dive deep into the new models coming out from IBM with its introduction of Granite 3.0. Upon initial review, the work that is being done by IBM is at the leading edge of what can be done for smaller language models being utilized for generative AI solutions.
@VentureBeat
VentureBeat
1 year
IBM debuts open source Granite 3.0 LLMs for enterprise AI
0
7
12