Harshit Anand

@devbyharshit

Followers: 158 · Following: 5K · Media: 410 · Statuses: 3K

Senior Frontend Engineer | CSE @IIITD | Crafting slick UIs with Next.js, TypeScript & TailwindCSS | Building the future, one pixel & pun at a time | Let’s code

Janakpuri, New Delhi
Joined November 2017
@devbyharshit
Harshit Anand
2 years
🚀🔍 JavaScript Interview Question Alert! 🌟 💡 Challenge: Take a look at this code snippet: 🧐 Can you predict the output for each `console.log` statement? Dive into the world of object property access and share your insights! 💬🚀 #JavaScriptInterview #100DaysOfCode
1
0
3
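The snippet referenced in the tweet above isn't included in the text here, so the following is a hypothetical example of the kind of object-property-access question being described; the names and values are made up.

```js
// Hypothetical reconstruction: the original snippet was posted as an attachment.
// A typical object-property-access quiz looks like this:
const key = "name";
const user = {
  name: "Harshit",
  "fav-stack": "Next.js",
  1: "one",
};

console.log(user.name);         // "Harshit" (dot notation)
console.log(user[key]);         // "Harshit" (bracket notation with a variable)
console.log(user["fav-stack"]); // "Next.js" (brackets required; "-" breaks dot access)
console.log(user[1]);           // "one" (numeric keys are coerced to strings)
console.log(user.age);          // undefined (missing properties don't throw)
```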
@devbyharshit
Harshit Anand
11 days
God bless the one who invented Ctrl/Cmd + Z
0
0
0
@devbyharshit
Harshit Anand
20 days
The lesson: Great systems don’t guess. They observe, learn, and evolve. Where else do you think “adaptive optimization” could unlock huge performance wins? 👇 — End 🧵 #buildinginpublic #javascriptlearning
0
0
0
@devbyharshit
Harshit Anand
20 days
That balance is why JIT changed modern computing. Your code doesn’t need to start optimized — it just needs a chance to adapt.
1
0
0
@devbyharshit
Harshit Anand
20 days
But for anything long-running — servers, browsers, mobile runtimes, real-time apps — JIT gives you a rare combo: speed + flexibility + intelligence.
1
0
0
@devbyharshit
Harshit Anand
20 days
But JIT isn’t perfect. There is a warm-up period before peak speed. And it uses more memory to store profiles + optimized code. For tiny scripts or short-lived processes? Interpretation or AOT can still win.
1
0
0
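One way to feel the warm-up cost mentioned above is to time an early batch of calls against a later one. A minimal sketch, assuming Node.js; the function and batch sizes are arbitrary, and real numbers depend heavily on the machine and V8 version.

```js
// Warm-up sketch (assumes Node.js). Timings vary by machine and V8 version,
// and a function this small may be optimized almost immediately, so treat
// the numbers as illustrative rather than a rigorous benchmark.
function sumDigits(n) {
  let s = 0;
  while (n > 0) { s += n % 10; n = (n / 10) | 0; }
  return s;
}

function timeBatch(label, iterations) {
  const start = process.hrtime.bigint();
  let acc = 0;
  for (let i = 0; i < iterations; i++) acc += sumDigits(i);
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(2)} ms (acc=${acc})`);
}

timeBatch("cold batch (interpreter + tier-up)", 100_000);
timeBatch("warm batch (mostly optimized code)", 100_000);
```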
@devbyharshit
Harshit Anand
20 days
Real-world examples? Everywhere:
- JVM → enterprise apps at global scale
- Chrome V8 → JavaScript engine behind fast websites
- Android Runtime → makes apps snappy without killing battery
If your device feels fast… JIT is probably working behind the scenes.
1
0
0
@devbyharshit
Harshit Anand
20 days
And here’s where JIT gets truly powerful: adaptive optimization. It doesn’t just compile once. It learns from how your app behaves:
- Common branches
- Repeated inputs
- Data type patterns
Then it rewrites the machine code on the fly. Your program literally gets faster as it runs.
1
0
0
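The "data type patterns" point above is the easiest to demonstrate. A minimal sketch of how object shapes affect it, assuming a V8-style engine with inline caches; the exact thresholds and heuristics are engine internals, not anything the thread specifies.

```js
// Illustration of "data type patterns" (assumes a V8-style engine with
// inline caches; the exact heuristics are an implementation detail).
function area(shape) {
  return shape.width * shape.height;
}

// Monomorphic: every call sees the same object shape, so optimized code
// can read width/height at fixed offsets.
for (let i = 0; i < 1e6; i++) area({ width: i, height: 2 });

// Polymorphic/megamorphic: mixing shapes (different property order, extra
// fields) forces slower, more generic property lookups and can trigger
// deoptimization of previously optimized code.
for (let i = 0; i < 1e6; i++) {
  area(i % 2 === 0
    ? { width: i, height: 2 }
    : { height: 2, width: i, label: "rect" });
}
```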
@devbyharshit
Harshit Anand
20 days
The JIT engine has four core components:
🏁 Interpreter → fast startup
👀 Profiler → watches code behavior
⚙️ JIT compiler → optimizes only what matters
🚀 Compiled code → replaces the slow version
Efficiency through focus.
1
0
0
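As a purely conceptual model of those four components (real engines do this inside the VM, not in user code), here is a toy version in plain JavaScript; HOT_THRESHOLD and the "compile" callback are invented for the sketch.

```js
// Toy model of the interpreter → profiler → JIT compiler → compiled-code loop.
// Conceptual only: HOT_THRESHOLD and the compile step are made up.
const HOT_THRESHOLD = 1000;

function makeTieredFunction(interpretedFn, compileFn) {
  let callCount = 0;    // profiler: watches behavior
  let optimized = null; // compiled code: replaces the slow version

  return function (...args) {
    if (optimized) return optimized(...args); // fast path once hot
    callCount++;
    if (callCount >= HOT_THRESHOLD) {
      optimized = compileFn();                // JIT compiler: optimize what matters
    }
    return interpretedFn(...args);            // interpreter: fast startup
  };
}

// Usage: same observable behavior, faster implementation swapped in when hot.
const square = makeTieredFunction(
  (x) => x * x,
  () => (x) => x * x // stand-in for "emit native code"
);
for (let i = 0; i < 5000; i++) square(i);
```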
@devbyharshit
Harshit Anand
20 days
Think of it like this 👇
Interpretation = translate every line every time
AOT = translate the entire book before reading
JIT = translate as you go + remember the repeated sections for instant recall
That’s why JIT approaches AOT performance without AOT rigidity.
1
0
0
@devbyharshit
Harshit Anand
20 days
Once something is a hot spot, JIT converts it into native machine code—the stuff your CPU loves to execute instantly. Result:
- No delay at launch
- Huge speed boost over time
- Smarter optimization than static compilers
1
0
0
@devbyharshit
Harshit Anand
20 days
Here’s how it works: When your program starts, it runs through an interpreter. Meanwhile, a profiler quietly tracks what parts run the most. Those frequently executed chunks? They become hot spots 🔥.
1
0
0
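If you want to watch this happen on your own machine, V8 exposes tracing flags through Node.js. A small sketch; the exact log output depends on the Node/V8 version.

```js
// hot.js (run with: node --trace-opt --trace-deopt hot.js)
// --trace-opt / --trace-deopt are V8 flags that Node passes through;
// the log format varies between V8 versions.
function hot(n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += i * i;
  return total;
}

// The first calls run through the interpreter (Ignition) while the profiler
// collects counts; once `hot` crosses the hotness threshold, V8's optimizing
// compiler (TurboFan) kicks in and shows up in the trace output.
let result = 0;
for (let i = 0; i < 100_000; i++) result = hot(100);
console.log(result);
```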
@devbyharshit
Harshit Anand
20 days
JIT = Just-in-Time compilation. It doesn’t compile everything up front. Instead, it waits… watches… and optimizes code during execution ⚡ Like a translator who starts slow—then memorizes the repeated sentences.
1
0
0
@devbyharshit
Harshit Anand
20 days
Interpreters are flexible. Compilers are fast. JIT said: why not both? Here’s how modern apps get speed and smarts 🧵
1
0
1
@devbyharshit
Harshit Anand
2 months
I should be sleeping right now like the rest but here I am.
0
0
0
@devbyharshit
Harshit Anand
3 months
Why am I getting such a privilege?
0
0
0
@devbyharshit
Harshit Anand
3 months
The lesson: optimization often beats brute force. Where else have you seen small tweaks unlock massive scale gains? — End 🧵
0
0
1
@devbyharshit
Harshit Anand
3 months
✨ Takeaway: Scaling isn’t always brute force. Swiggy saved millions with smart math:
- Observe real usage
- Cluster intelligently
- Optimize without hurting UX
Sometimes, a few pixels of compromise = massive wins.
1
0
0
@devbyharshit
Harshit Anand
3 months
🔄 But it’s not “set it & forget it.”
- New devices = new widths
- New video formats = new requirements
Swiggy reruns clustering periodically to keep buckets optimal.
1
0
0
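The thread doesn't say which clustering method Swiggy uses, so the following is a minimal 1-D k-means style sketch of the idea: feed in the widths users actually requested, get back k = 8 bucket candidates, and rerun it on fresh logs from time to time.

```js
// Minimal 1-D k-means style sketch for re-deriving bucket widths from observed
// requests. The actual clustering method is not specified in the thread; this
// only shows the general idea, with k = 8 buckets as mentioned.
function clusterWidths(observedWidths, k = 8, iterations = 20) {
  const sorted = [...observedWidths].sort((a, b) => a - b);
  // Seed centroids evenly across the sorted data.
  let centroids = Array.from(
    { length: k },
    (_, i) => sorted[Math.floor(((i + 0.5) * sorted.length) / k)]
  );

  for (let it = 0; it < iterations; it++) {
    const sums = new Array(k).fill(0);
    const counts = new Array(k).fill(0);
    for (const w of sorted) {
      // Assign each width to its nearest centroid.
      let best = 0;
      for (let c = 1; c < k; c++) {
        if (Math.abs(w - centroids[c]) < Math.abs(w - centroids[best])) best = c;
      }
      sums[best] += w;
      counts[best]++;
    }
    // Move each centroid to the mean of its cluster (keep it if the cluster is empty).
    centroids = centroids.map((c, i) => (counts[i] ? sums[i] / counts[i] : c));
  }
  return centroids.map(Math.round).sort((a, b) => a - b);
}

// Example with a made-up sample of requested widths; rerun periodically on real logs.
console.log(clusterWidths([320, 328, 344, 360, 412, 414, 480, 540, 600, 720, 768, 1080]));
```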
@devbyharshit
Harshit Anand
3 months
🚀 Results:
- Fewer video transformations → big cost savings 💸
- Higher CDN cache hits → faster loads ⚡
- Quality unaffected → users didn’t notice 👏
1
0
0
@devbyharshit
Harshit Anand
3 months
📲 Enter the Bucketizer inside the app:
- Takes the original request
- Maps it to nearest bucket width via MediaBucketMap
- Requests video with this “bucketized” URL
Now DAM + CDN only deal with 8 widths instead of hundreds 🔥
1
0
0
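A sketch of what that app-side bucketizer might look like. MediaBucketMap is named in the tweet above, but the 8 widths and the URL scheme below are made-up placeholders, not Swiggy's real values.

```js
// Sketch of an app-side bucketizer. MediaBucketMap is named in the thread,
// but these 8 widths are placeholders, not Swiggy's real values.
const MediaBucketMap = [120, 240, 360, 480, 640, 720, 1080, 1440];

// Map an arbitrary requested width to the nearest supported bucket.
function bucketize(requestedWidth) {
  return MediaBucketMap.reduce((best, w) =>
    Math.abs(w - requestedWidth) < Math.abs(best - requestedWidth) ? w : best
  );
}

// Build the "bucketized" URL so the DAM/CDN only ever sees 8 widths.
function bucketizedVideoUrl(baseUrl, requestedWidth) {
  const width = bucketize(requestedWidth);
  return `${baseUrl}?w=${width}`; // hypothetical URL scheme
}

console.log(bucketize(333));                                           // → 360
console.log(bucketizedVideoUrl("https://cdn.example.com/v/abc", 333)); // → ...?w=360
```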