Explore tweets tagged as #productioncoding
Gemini 2.5 Pro vs. OpenAI o3: Gemini 2.5 Pro is twice as expensive, and here is why. #ProductionCoding #RustCoding
AIPack `pro@coder` 0.2.20 released!
+ Plan-based dev support, with dev/plan-todo.md and dev/plan-done.md templates.
+ `max_files_size_kb: ...` support (defaults to 5MB), to prevent sending too much data.
+ Tunings.
To install: aip install pro@coder
@AIPackAI #AICoding #ProductionCoding
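For readers curious what a `max_files_size_kb`-style cap does in practice, here is a minimal Rust sketch of the idea: stop attaching files once a cumulative size budget is hit. This is an illustration only, with made-up helper names; it is not AIPACK's actual implementation.

```rust
use std::fs;
use std::path::PathBuf;

/// Illustrative only: keep files until a cumulative size budget (in KB) is exceeded,
/// mirroring the idea behind a `max_files_size_kb` setting (5 MB default here).
fn cap_files_by_size(paths: Vec<PathBuf>, max_files_size_kb: u64) -> Vec<PathBuf> {
    let budget_bytes = max_files_size_kb * 1024;
    let mut used: u64 = 0;
    let mut kept = Vec::new();

    for path in paths {
        // Skip files we cannot stat rather than failing the whole run.
        let Ok(meta) = fs::metadata(&path) else { continue };
        let size = meta.len();
        if used + size > budget_bytes {
            // Budget exhausted: stop adding files to the prompt context.
            break;
        }
        used += size;
        kept.push(path);
    }
    kept
}

fn main() {
    let files = vec![PathBuf::from("src/main.rs"), PathBuf::from("src/lib.rs")];
    // 5 MB default, as in the release note.
    let kept = cap_files_by_size(files, 5 * 1024);
    println!("sending {} file(s)", kept.len());
}
```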
Gemini 2.5 Pro vs. #OpenAI o3: actual #Coding costs.
Gemini 2.5 Pro is often 2.3x or more expensive than OpenAI o3. OpenAI o3 uses very little reasoning by default, which makes all the difference in cost.
Below: the exact same coding task (git restore . + replay). #ProductionCoding
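For the mechanics behind a comparison like this, here is a small Rust sketch of per-task cost math: tokens used times per-million-token price, summed over input and output (reasoning tokens are typically billed as output). The prices and token counts below are placeholders, not the actual figures from this task.

```rust
/// Placeholder pricing, in USD per 1M tokens (not real numbers).
struct ModelPricing {
    name: &'static str,
    input_per_m: f64,
    output_per_m: f64, // reasoning tokens are usually billed as output tokens
}

/// Cost of one coding task given its token usage.
fn task_cost(p: &ModelPricing, input_tokens: u64, output_tokens: u64) -> f64 {
    (input_tokens as f64 / 1_000_000.0) * p.input_per_m
        + (output_tokens as f64 / 1_000_000.0) * p.output_per_m
}

fn main() {
    // Hypothetical models and usage, purely to show the mechanics of the comparison.
    let a = ModelPricing { name: "model-a", input_per_m: 1.25, output_per_m: 10.0 };
    let b = ModelPricing { name: "model-b", input_per_m: 2.00, output_per_m: 8.0 };

    // Same task replayed on both; model-a emits far more reasoning (output) tokens.
    let cost_a = task_cost(&a, 60_000, 25_000);
    let cost_b = task_cost(&b, 60_000, 6_000);

    println!("{}: ${:.4}", a.name, cost_a);
    println!("{}: ${:.4}", b.name, cost_b);
    println!("cost ratio: {:.1}x", cost_a / cost_b);
}
```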
Using #AIPACK pro@coder to code the new AIPACK TUI, currently being built ... (AIPACK is now about 35k LOC, but context control makes all the difference) #ProductionAI #ProductionCoding #BuiltInRust #rustlang
Live Production Rust Coding Now: Sat 31 May, 4PM PT https://t.co/zNxZ7qA9Yt
#RustLang #RustProgramming #ProductionCoding #YesWithAI
For anyone interested in a quick pricing comparison, my current production coding models are:
- Bigger work ➜ Gemini 3 (backup: 5.1-codex)
- Simpler work ➜ 5.1-codex-mini (backup: flash-latest)
#AICoding #ProductionCoding #Gemini3 @openai
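As a rough illustration of this primary/backup split, a small Rust routing sketch: the model IDs come from the list above, while `TaskSize`, `ModelChoice`, and `pick_model` are hypothetical names invented for the example.

```rust
/// Illustrative only: route a task to a primary model, with a named backup.
enum TaskSize {
    Bigger,
    Simpler,
}

struct ModelChoice {
    primary: &'static str,
    backup: &'static str,
}

fn pick_model(size: TaskSize) -> ModelChoice {
    match size {
        TaskSize::Bigger => ModelChoice { primary: "gemini-3", backup: "5.1-codex" },
        TaskSize::Simpler => ModelChoice { primary: "5.1-codex-mini", backup: "flash-latest" },
    }
}

fn main() {
    for size in [TaskSize::Bigger, TaskSize::Simpler] {
        let choice = pick_model(size);
        println!("primary: {} (backup: {})", choice.primary, choice.backup);
    }
}
```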
This seems obvious, but unfortunately some devs still skip basic best practices and Git discipline (especially when using AI). We do not have to be Git experts to use Git well. #ProductionCoding #programming
https://t.co/FEhptq39K8
Big release of #AIPack 0.8.10 and pro@coder 0.2.28, with attachment support (image and PDF). See the demo below, turning a static image into an interactive chart. #vibecoding to #productioncoding @AIPackAI
https://t.co/Qk9AoyOj1R
GitHub Copilot AI PRs take much longer to review. There is a high noise-to-content ratio, especially compared to the work of a good developer. Developers should use AI, but they should take ownership of their PRs. #MicromanageAI #UseAI #WithControl #ProductionCoding
AI For Production Coding note: GPT-5 still codes better than #Sonnet 4.5, but GPT-5 is so slow (via the API) that I am starting to use Sonnet more and more. I hope #OpenAI will solve this issue. Waiting for Gemini 3 Pro. #OpenAI #Anthropic #GoogleGemini #ProductionCoding
JC Rust Live Coding ➜ Today 10:30am PT
GenAI Library: Adding Multi-Content to Streaming
https://t.co/n7UgD53Fw2
#LiveCoding #RealCoding #ProductionCoding #rustlang
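For a sense of what "multi-content" in a streaming response can mean, here is a hedged Rust sketch where one stream interleaves several kinds of content parts (text, reasoning, tool calls). The `StreamPart` enum and `fake_stream` helper are invented for illustration; this is not the genai crate's actual streaming API.

```rust
// Illustrative only: a simplified shape for "multi-content" streaming, where one
// response stream interleaves different kinds of content parts.
// This is NOT the genai crate's actual API.
enum StreamPart {
    Text(String),                                 // regular assistant text
    Reasoning(String),                            // reasoning tokens, when exposed
    ToolCall { name: String, args_json: String }, // a structured tool call
    Done,
}

/// Stand-in for a provider stream: here just a fixed sequence of parts.
fn fake_stream() -> impl Iterator<Item = StreamPart> {
    vec![
        StreamPart::Reasoning("planning the edit...".into()),
        StreamPart::Text("Here is the updated function:".into()),
        StreamPart::ToolCall {
            name: "apply_patch".into(),
            args_json: r#"{"file":"src/lib.rs"}"#.into(),
        },
        StreamPart::Done,
    ]
    .into_iter()
}

fn main() {
    // A consumer handles each content kind differently as parts arrive.
    for part in fake_stream() {
        match part {
            StreamPart::Text(t) => println!("{t}"),
            StreamPart::Reasoning(r) => eprintln!("[reasoning] {r}"),
            StreamPart::ToolCall { name, args_json } => eprintln!("[tool] {name}({args_json})"),
            StreamPart::Done => println!("[done]"),
        }
    }
}
```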
Master pricing with AI coding: craft a sharp prompt for Flash 2.5 for simple coding tasks. Input/output is virtually free. Even with a few reasoning tokens at $3.50, it is still dirt cheap. Gemini 2.5 Pro & Flash are the perfect pair for #ProductionCoding
@OfficialLoganK @GoogleAI
👇My new coding pair👇
gpt-5-codex AND gemini-flash-latest
Why:
➜ gpt-5-codex: slow, but the best coder.
➜ gemini-flash-latest (09-2025): fast, cheap, and pretty good!!
Why not #Sonnet 4.5:
➜ It doesn't listen well and is expensive.
#genai #AICoding #ProductionCoding
Ok, GLM 4.5 is pretty good at coding, and dirt cheap compared to Gemini 2.5 Pro. But for production coding, it falls short sometimes. Gemini 2.5 Pro is still the big boy there. #genai #AICoding #ProductionCoding cc @OfficialLoganK
@OfficialLoganK Awesomeness!!! Gemini 2.5 Pro + Gemini 2.5 Flash are the perfect #AICoding duo. #ProductionCoding
@OfficialLoganK Can't wait for "Deep Think" access via the API! I will definitely integrate it into the #rust #genai crate. Gemini 2.5 Flash / Pro make such a perfect pair for #ProductionCoding.
@sdrzn Our take for #ProductionCoding is to go:
1) API (pay per usage)
2) Provider-unlocked: use tools that are model-provider agnostic.
Try #AIPack with pro@coder. #AIProductionCoding
A little bit more work, but so much more control.
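A minimal Rust sketch of what "provider-unlocked" can look like in code: application logic programs against a small trait, and concrete providers are swapped behind it. The `ChatProvider` trait and both impls are hypothetical, not #AIPack's or any real library's API.

```rust
// Illustrative only: a tiny provider-agnostic abstraction.
// Neither the trait nor the impls correspond to a real library's API.

trait ChatProvider {
    fn name(&self) -> &str;
    fn complete(&self, prompt: &str) -> String;
}

struct ProviderA;
struct ProviderB;

impl ChatProvider for ProviderA {
    fn name(&self) -> &str { "provider-a" }
    fn complete(&self, prompt: &str) -> String {
        format!("[provider-a] answer to: {prompt}")
    }
}

impl ChatProvider for ProviderB {
    fn name(&self) -> &str { "provider-b" }
    fn complete(&self, prompt: &str) -> String {
        format!("[provider-b] answer to: {prompt}")
    }
}

/// Application code only sees the trait, so switching providers is a config change,
/// not a rewrite.
fn run_task(provider: &dyn ChatProvider, prompt: &str) {
    println!("{} -> {}", provider.name(), provider.complete(prompt));
}

fn main() {
    let providers: Vec<Box<dyn ChatProvider>> = vec![Box::new(ProviderA), Box::new(ProviderB)];
    for p in &providers {
        run_task(p.as_ref(), "rename this function");
    }
}
```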
#sonnet 4.5 seems to be MUCH FASTER than gpt-5[-codex] (via the API). I'll try to get some hard numbers. OpenAI needs to fix this performance issue fast. #AICoding #ProductionCoding @AnthropicAI @OpenAI @GoogleAI @AipackAI