Explore tweets tagged as #AIProductionCoding
Actually, #OpenAI o3 is pretty good at coding. #Gemini 2.5 Pro is good too. #AIProductionCoding @aipackai #AIPACKProCoder
- #OpenAI o1-preview is great for #ProductionCoding
- Implemented After All for #rustlang #devai, flawless
- But it's a bit expensive: $1.20 for one file, 2 iterations
- The $60/M output tokens can add up quickly
#genai #AICoding #AIProductionCoding
@OfficialLoganK The passion and dedication are commendable. Thank you and the team. So far Gemini 3 for AI Production Coding has been amazing. #rustlang #AIProductionCoding
Planning to rename the `jc@coder` AI Pack to `code10x@coder`. More packs will be under the `code10x@` namespace:
- `code10x@rust` for Rust best practices
- `code10x@kdd` for Kubernetes development
- `code10x@dom-native` for native web components
#AIProductionCoding @AipackAI
#AIProductionCoding Gemini Flash 2.5 is really good at code documentation. So it's a great model to refresh, update, and consolidate docs. And it's virtually free. A perfect coding partner alongside Gemini Pro 2.5. @GoogleAI @OfficialLoganK
#AIProductionCoding It's important to be able to switch between Claude and Gemini 2.5 Pro. Sometimes, 2.5 Pro goes berserk on requirements. #genai #AICoding @AipackAI
One single TypeScript spec file for #MCP. Thank you @AnthropicAI for this! A single spec file should be the norm for AI Production Coding. We are starting to do that for our internal and external libs, and then use those single files as knowledge. #genai #AIProductionCoding
Implementing attachment support in #AIPack and image support in `pro@coder` ... release and video coming ... #AIProductionCoding #genai @AipackAI
Can't wait for the Gemini Pro 3 API. gpt-5 / gpt-5-codex are so slow, barely usable. #genai #AIProductionCoding @AIPackAI
Done: #devai scripting fully migrated to #Lua (from Rhai). Rust and #Lua make a great duo (thanks to mlua). Runs a scriptable agent in under 10MB, all included. Python not required. https://t.co/noabiMalEg Preparing for the Devai 0.5.0-WIP release. @rustlang #AIProductionCoding
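A minimal sketch of the Rust + Lua pairing via the mlua crate (illustrative only, not devai's actual agent runtime): the Rust host exposes a value to the Lua side and evaluates a small script, which is the basic pattern a scriptable agent builds on.

```rust
// Minimal example of embedding Lua in a Rust binary with the `mlua` crate.
// Illustrative sketch, not devai's actual code.
use mlua::{Lua, Result};

fn main() -> Result<()> {
    let lua = Lua::new();

    // Expose a value from the Rust host to the Lua scripting layer.
    lua.globals().set("agent_name", "demo-agent")?;

    // Run a small Lua snippet and pull the result back into Rust.
    let msg: String = lua.load(r#""hello from " .. agent_name"#).eval()?;

    println!("{msg}");
    Ok(())
}
```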
Ok, gpt-5.1 and gpt-5.1-codex seem to be pretty fast. #rustlang #AIProductionCoding
@sdrzn Our take for #ProductionCoding is to go:
1) API (per usage)
2) Provider unlocked - use tools that are model-provider agnostic
Try #AIPack with pro@coder. #AIProductionCoding
A little bit more work, but so much more control.
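A hedged sketch of what "provider unlocked" can mean in practice: code against a small, provider-agnostic trait so Claude, Gemini, or OpenAI backends can be swapped without touching the calling code. The trait and struct names here are hypothetical and not part of AIPack's API.

```rust
// Hypothetical provider-agnostic abstraction; names are illustrative only.
trait ChatModel {
    fn name(&self) -> &str;
    fn complete(&self, prompt: &str) -> String;
}

struct ClaudeBackend;
struct GeminiBackend;

impl ChatModel for ClaudeBackend {
    fn name(&self) -> &str { "claude" }
    fn complete(&self, prompt: &str) -> String {
        // The real Anthropic API call would go here (stubbed for the sketch).
        format!("[claude] {prompt}")
    }
}

impl ChatModel for GeminiBackend {
    fn name(&self) -> &str { "gemini-2.5-pro" }
    fn complete(&self, prompt: &str) -> String {
        // The real Gemini API call would go here (stubbed for the sketch).
        format!("[gemini] {prompt}")
    }
}

// The calling code stays the same regardless of which provider is plugged in.
fn run_task(model: &dyn ChatModel, prompt: &str) {
    println!("{} -> {}", model.name(), model.complete(prompt));
}

fn main() {
    run_task(&ClaudeBackend, "Refactor the parser module.");
    run_task(&GeminiBackend, "Refactor the parser module.");
}
```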
Man, this gpt-5.1-codex is good and fast, and relatively cheap because it is very efficient with reasoning tokens. (for Rust Coding and more) #rustlang @OpenAI #AIProductionCoding #AIPack
Trick for #AIProductionCoding: generate the `https://t.co/G71MZowFpX` for the external libraries you use (from md docs). Reduce context by 100x to 1000x. Get 2x to 5x better code precision. Gemini Flash is perfect for that (high context, good output). @OfficialLoganK
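A rough sketch of the gathering step, assuming the generated artifact is a single knowledge file built from a library's markdown docs (the file names and layout below are made up for illustration). The 100x to 1000x reduction would come from having a model like Gemini Flash condense the merged file afterwards.

```rust
// Hypothetical doc-consolidation sketch: merge a library's markdown docs
// into one knowledge file that can later be condensed and attached as context.
use std::fs;
use std::io::Write;
use std::path::Path;

fn consolidate_docs(doc_dir: &Path, out_file: &Path) -> std::io::Result<()> {
    let mut out = fs::File::create(out_file)?;
    for entry in fs::read_dir(doc_dir)? {
        let path = entry?.path();
        if path.extension().and_then(|e| e.to_str()) == Some("md") {
            // Keep the source path as a section header so the model can cite it.
            writeln!(out, "## {}\n", path.display())?;
            writeln!(out, "{}\n", fs::read_to_string(&path)?)?;
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // Example paths are placeholders.
    consolidate_docs(Path::new("docs/"), Path::new("lib-knowledge.md"))
}
```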
So far:
➜ `gpt-5.1` Surprisingly fast and cheap for simple tasks
➜ `gpt-5.1-codex` Seems good and faster than 5-codex
➜ `gpt-5.1-codex-mini` Seems pretty good, even cheaper, and faster than `gemini-flash-latest` so far
@OpenAI @GoogleAI #GoogleGemini #AIProductionCoding
`jc@coder` v0.1.4 AI Pack is here!
- Claude/@AnthropicAI caching
- Working_globs for AI lensing and concurrency
- "Show doc" prompt
- Plus a bunch of other improvements
BTW, `jc@coder` now generates about 15-30% of @AipackAI code. https://t.co/nX3ufUdGbc
#AIProductionCoding
Interesting @AnthropicAI haiku-4.5 price increase. Now haiku is at 2x to 3x the price of gemini-flash-latest. Will compare the two soon... @GoogleAI #genai #AIProductionCoding
My new AI coding pair is:
Bigger tasks ➜ Opus 4.5 ➜ Fast, top-quality coder
Smaller tasks ➜ gemini-flash-latest ➜ Cheap, good for up to medium complexity
#AIProductionCoding @AnthropicAI @GoogleAI @OpenAI
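A hedged sketch of that routing rule as code (the enum, buckets, and model id strings are made up for illustration, not an AIPack feature): pick the model from a rough task-complexity estimate and leave the rest of the pipeline unchanged.

```rust
// Hypothetical task-size routing; the complexity buckets and model id
// strings are placeholders, matching the pairing above only in spirit.
enum TaskSize {
    Small,
    Medium,
    Large,
}

fn pick_model(size: TaskSize) -> &'static str {
    match size {
        // Cheap and good enough up to medium complexity.
        TaskSize::Small | TaskSize::Medium => "gemini-flash-latest",
        // Bigger, quality-sensitive tasks go to the stronger model.
        TaskSize::Large => "opus-4.5", // placeholder id
    }
}

fn main() {
    println!("{}", pick_model(TaskSize::Large));
    println!("{}", pick_model(TaskSize::Small));
}
```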
@mattpocockuk Interestingly, when using pro@coder we switch back and forth between planning and non-planning. We even interleave this with spec-based approaches. ➜ Parametric prompts that allow us to toggle these modes on and off depending on the intent. #AIPack #ProCoder #AIProductionCoding
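A rough sketch of what a parametric planning/non-planning toggle could look like (the struct, fields, and wording below are hypothetical, not pro@coder's actual parameters): the same base task prompt, with planning and spec-based preambles switched on or off by flags.

```rust
// Hypothetical parametric prompt toggle; not pro@coder's real interface.
struct PromptParams {
    planning: bool,
    spec_based: bool,
}

fn build_prompt(task: &str, p: &PromptParams) -> String {
    let mut prompt = String::new();
    if p.planning {
        prompt.push_str("First produce a short implementation plan, then the code.\n");
    }
    if p.spec_based {
        prompt.push_str("Follow the attached spec file; do not invent new APIs.\n");
    }
    prompt.push_str(task);
    prompt
}

fn main() {
    let params = PromptParams { planning: true, spec_based: false };
    println!("{}", build_prompt("Add attachment support to the parser.", &params));
}
```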