AKHIL

@RealTimeBytes

Followers 181 · Following 19K · Media 587 · Statuses 5K

Curious mind exploring interesting ideas 🧠✨ Sharing thoughts and the occasional emoji reaction 😂 Keep it simple, keep it real 💯

Joined August 2024
AKHIL @RealTimeBytes · 13 hours
In 2010, BlackBerry followed a strategy much like Apple's today: it stuck to its privacy-first messaging while rivals shifted to touchscreen phones. BlackBerry eventually stopped making phones altogether. Apple's situation mirrors this: Siri lacks advanced AI features.
AKHIL @RealTimeBytes · 15 hours
This approach fundamentally changes how we might tackle long-context modeling. By transforming text into a visual modality, Glyph unlocks new levels of efficiency and speed. The paper is out now, and it's a must-read. Read the full paper: https://t.co/ZwR7WOqcNS
github.com: Official Repository for "Glyph: Scaling Context Windows via Visual-Text Compression" (thu-coai/Glyph)
AKHIL @RealTimeBytes · 15 hours
How do they make this work? It's a sophisticated 3-stage framework: 1️⃣ Continual pre-training on massive amounts of rendered long-text data. 2️⃣ An LLM-driven genetic search to find the optimal rendering configurations. 3️⃣ Final post-training with SFT & RL to fine-tune the model.
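Stage 2️⃣ can be sketched in miniature. This is not Glyph's actual search (which is LLM-driven and also scores task accuracy); it's a plain hill-climbing loop over hypothetical rendering parameters, scored only by an estimated vision-token cost:

```python
import math
import random

def vision_tokens(cfg, n_chars=512_000, patch=28):
    """Estimated vision tokens for text rendered with this configuration
    (assumes the VLM tokenizes images in fixed patch x patch squares)."""
    lines = math.ceil(n_chars / cfg["chars_per_line"])
    height_px = lines * cfg["line_height"]
    return math.ceil(cfg["width"] / patch) * math.ceil(height_px / patch)

def mutate(cfg, rng):
    """Randomly perturb one rendering parameter."""
    out = dict(cfg)
    key = rng.choice(sorted(out))
    out[key] = max(8, int(out[key] * rng.uniform(0.8, 1.25)))
    return out

def search_config(generations=200, seed=0):
    """Keep a mutated configuration whenever it renders the same text
    into fewer vision tokens."""
    rng = random.Random(seed)
    best = {"chars_per_line": 80, "line_height": 14, "width": 960}
    for _ in range(generations):
        cand = mutate(best, rng)
        # crude legibility constraint: at least 6 px of width per character
        if cand["chars_per_line"] * 6 > cand["width"]:
            continue
        if vision_tokens(cand) < vision_tokens(best):
            best = cand
    return best
```

A real fitness function would also penalize configurations that hurt downstream accuracy; compression pressure alone would drive the font size toward zero.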
AKHIL @RealTimeBytes · 15 hours
And the results are staggering. 🤯 Glyph achieves competitive results against text-based LLMs that require 3x to 3.5x LONGER context windows. It shows notable compression and massive inference speedups, especially on 128K-token inputs.
AKHIL @RealTimeBytes · 15 hours
This is Glyph: "Scaling Context Windows via Visual-Text Compression." Instead of feeding endless text tokens to an LLM, Glyph compresses the entire long-form text into a single, compact image and uses a Vision-Language Model (VLM) to process it. It's visual-text compression.
AKHIL @RealTimeBytes · 15 hours
🚨 Is your LLM constantly hitting its context limit? What if the solution wasn't more text, but... images? This might break your brain. There's a new method that renders long text into compact images to achieve massive context scaling. Meet Glyph. 🧵👇
AKHIL @RealTimeBytes · 17 hours
Ready for your first genius AI answer? Ask your AI right now: "Explain how a smart contract works using ONLY a comic book analogy." Drop your best AI answer in the comments! 👇
AKHIL @RealTimeBytes · 17 hours
This hack works because it turns a simple request (prompting) into a creative challenge. Kitchen Analogies 🍳 (e.g., the CPU is the chef, RAM is the prep counter). Sports 🏀 (e.g., explaining networking as a basketball team). Comic Books 🦸 (e.g., explaining gravity with superhero physics).
AKHIL @RealTimeBytes · 17 hours
The "Fake Constraint" is simply telling the AI to explain a complex topic using a specific, often silly, rule or theme. As a beginner, think of it like this: you're giving the AI a costume for its answer. Example: ask it to explain E=mc^2 using ONLY analogies from 🏈 sports or 🍳 cooking.
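Mechanically, the hack is just string templating. A minimal sketch (the wording is illustrative, not a canonical prompt):

```python
def fake_constraint_prompt(topic, theme):
    """Build a 'fake constraint' prompt: the theme is the costume
    the answer has to wear."""
    return (
        f"Explain {topic} using ONLY analogies from {theme}. "
        f"Map every key concept to something from {theme}, "
        "and avoid bare technical vocabulary."
    )

prompt = fake_constraint_prompt("E=mc^2", "cooking")
```

Paste the resulting string into any chat model; swapping the theme swaps the whole flavor of the answer.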
AKHIL @RealTimeBytes · 17 hours
STOP settling for boring AI answers! 🛑 The secret to getting truly GENIUS responses that blow your mind is the "Fake Constraint" hack. This trick forces the AI to be a creative expert, not just a factual one. You have to try this! 👇
AKHIL @RealTimeBytes · 19 hours
Another open-source model, at a low cost.
MiniMax (official) @MiniMax__AI · 1 day
We're open-sourcing MiniMax M2: Agent & Code Native, at 8% of Claude Sonnet's price and ~2x faster ⚡ Free globally for a limited time via MiniMax Agent & API. Advanced coding capability: engineered for end-to-end developer workflows, with strong capability across a wide range of applications.
AKHIL @RealTimeBytes · 1 day
What's the real bottleneck holding AI back? It's not ideas. It's "GPUs and energy." He says every major lab is in the same boat: "We could run more experiments in parallel, but we just don't have the GPUs." The entire AI race is a race for compute. [You can watch the full podcast in the thread.]
AKHIL @RealTimeBytes · 1 day
Is an "AI Winter" coming? Kaiser's direct answer: no. "If anything, it may actually have a very sharp improvement in the next year or two, which is something to... almost be a little scared of." The old paradigm is plateauing, but the reasoning paradigm is just beginning its climb.
AKHIL @RealTimeBytes · 1 day
If it can't find the info, it will finally say "I don't know." Kaiser says Reasoners also learn from "another order of magnitude less data." They don't need to see the whole internet to learn math; they can be trained on a tiny (by comparison) set of math problems.
AKHIL @RealTimeBytes · 1 day
This is why hallucinations are finally disappearing. An old LLM asked "When does the San Francisco Zoo open?" would guess "10 AM" because it has seen that text. A Reasoner will think: "I need to find this info." It will use a tool (Google search), check the current website, and answer from what it actually finds.
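That look-it-up-or-admit-ignorance behavior can be sketched as a tiny tool loop. The search tool here is a stub dictionary standing in for a real web search:

```python
def stub_search(query):
    """Stand-in for a real search tool; returns None on a miss."""
    knowledge = {
        "san francisco zoo hours": "Open daily, 10:00 AM to 5:00 PM",
    }
    return knowledge.get(query.lower())

def answer_with_tools(query):
    """Toy reasoner policy: consult the tool instead of guessing,
    and say 'I don't know' when the tool comes back empty."""
    result = stub_search(query)
    return result if result is not None else "I don't know."
```

`answer_with_tools("San Francisco Zoo hours")` returns the stored hours; any query the tool can't answer yields "I don't know." instead of a confident guess.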
AKHIL @RealTimeBytes · 1 day
How are "Reasoners" different? Old LLMs guess the next word. Reasoners think first. They generate a private chain of thought (tokens for themselves) before giving you an answer. They are trained with Reinforcement Learning to find the correct answer, not just the most probable one.
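The "think first, answer second" split can be sketched as separating a private scratchpad from the visible reply (the solver here is hand-written, purely to show the shape):

```python
def reason_then_answer(question, solver):
    """Toy reasoner pattern: the scratchpad (chain of thought) stays
    private; only the final answer is shown to the user."""
    scratchpad = []                     # hidden reasoning tokens
    answer = solver(question, scratchpad)
    return {"answer": answer, "hidden_steps": len(scratchpad)}

def multiply_solver(question, scratchpad):
    """Illustrative step-by-step solver for 12 * 34."""
    a, b = 12, 34
    scratchpad.append(f"decompose: {a}*{b} = {a}*30 + {a}*4")
    scratchpad.append(f"partials: {a * 30} + {a * 4}")
    scratchpad.append(f"sum: {a * 30 + a * 4}")
    return str(a * b)

result = reason_then_answer("What is 12 * 34?", multiply_solver)
```

The user sees only `result["answer"]`; the intermediate steps exist solely to get that answer right, which is what RL training rewards.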
AKHIL @RealTimeBytes · 1 day
First, he says the 2017 Transformer paper wasn't a "eureka" moment. It was a practical solution. RNNs were too slow (sequential), and the authors combined existing ideas (attention) with a bunch of crucial "tweaks" (multi-head attention, warm-ups) that finally made it work. The key takeaway: we're still engineering our way forward, not waiting for single breakthroughs.
AKHIL @RealTimeBytes · 1 day
I just watched a podcast with Łukasz Kaiser, one of the 8 authors of the "Attention Is All You Need" paper that created Transformers (i.e., the "T" in ChatGPT). He explains we're ALREADY moving past it. Here's what's next. 🧵
Jon Hernandez @JonhernandezIA · 4 days
Full Podcast AI:
AKHIL @RealTimeBytes · 2 days
Why It Works for Beginners 🧠 When you use the "packed auditorium" phrase, the AI knows it has to: use simple language, cover all the main points without missing anything, and anticipate and prepare for follow-up questions.
AKHIL @RealTimeBytes · 2 days
The PROMPT HACK: Ask the AI to present the concept like a teacher speaking to a crowd! Just type: "Explain [Your Concept] to a packed auditorium." Example: "Explain Quantum Computing to a packed auditorium."
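The hack is a one-line template. A sketch (purely illustrative wrapping of the tweet's own phrase):

```python
def auditorium_prompt(concept):
    """Frame any concept with the 'packed auditorium' phrase, which
    nudges the model toward simple, complete, Q&A-ready answers."""
    return f"Explain {concept} to a packed auditorium."

print(auditorium_prompt("Quantum Computing"))
# Explain Quantum Computing to a packed auditorium.
```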