Ahmet Koç
@ahmetkoc
1K Followers · 2K Following · 74 Media · 975 Statuses
here for the coding / LLMs / ML content mainly. phd dropout, philosophy (ontology/theology) at uiuc. u.m. amherst. boğaziçi
istanbul
Joined October 2007
Coders are cooked ❌ Designers are cooked ❌ NBA players are cooked ✅, give it a few years for the labs to RL this to perfection and ✴️ https://t.co/R4Bqn6sJ84
First-ever real-world basketball demo by a humanoid robot 🤖🏀 Bonus: I became the first person to record a block against a humanoid🤭 #Robotics #AI #TechDemo #NBA
@Andercot @stephen_wolfram Definitely powerful. It then naturally follows that the reality you experience is shaped by how you are constructed, since what you observe is, for you, the whole of reality. In this regard, it is similar to what Kant alluded to in the Critique of Pure Reason.
wow, gpt5 just works. seems there is nothing it can't handle (for coding). sonnet is still a bit better at front-end though. https://t.co/yiLA6sAYYI
Listening to Lake Wobegon stories by Garrison Keillor https://t.co/UfDhohltHu
what 👀 https://t.co/kjvF9FIYl2
Excited to share Penzai, a JAX research toolkit from @GoogleDeepMind for building, editing, and visualizing neural networks! Penzai makes it easy to see model internals and lets you inject custom logic anywhere. Check it out on GitHub: https://t.co/mas2uiMqj9
crazy times incoming https://t.co/Z2nkTtNYwg
"bigger compute budgets, more compute/$, and more effective compute via better algorithms" We're on pace for a "$1T training run" in the 2029-2033 range, requiring ~10^17 data points* Welcome to the steep part of the S-curve! * of course, data is not all created equal!
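The quoted thread's "$1T training run needs ~10^17 data points" line can be sanity-checked with two standard approximations: training FLOPs C ≈ 6·N·D (N parameters, D tokens) and the Chinchilla-optimal ratio D ≈ 20·N. A minimal sketch, where the budget and the FLOPs-per-dollar efficiency are illustrative assumptions (not figures from the thread):

```python
import math

# Back-of-envelope check using C ≈ 6*N*D and Chinchilla-optimal D ≈ 20*N.
# Substituting N = D/20 gives C = 6*(D/20)*D, so D = sqrt(20*C/6).
def chinchilla_tokens(total_flops: float) -> float:
    """Tokens D for a Chinchilla-optimal run of compute budget C."""
    return math.sqrt(20.0 * total_flops / 6.0)

budget_usd = 1e12      # the thread's hypothetical $1T budget
flops_per_usd = 3e21   # ASSUMED future hardware + algorithm efficiency

total_flops = budget_usd * flops_per_usd
tokens = chinchilla_tokens(total_flops)
print(f"~{tokens:.1e} tokens")  # ~1.0e+17 tokens
```

With these (hedged) efficiency numbers the arithmetic lands on the same ~10^17 order of magnitude the thread cites; the footnote's caveat that "data is not all created equal" still applies.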
@tsarnick that is true. but of course, the moment you 'solve' this issue is the moment empirical evidence, implementation, etc. become irrelevant/trivial, mere temporal figments, as this is 'the' issue. you solve this = you solve the meaning of life, hence the lack of interest in the empirical 🙂
I got a fully maxed-out MacBook, mostly so I can run local models fast. And omg, here is mixtral running on @ollama... I almost cannot believe how fast it is. This is with no internet!! A model that beats GPT-3.5, running locally! What!
We were able to embed all of Wikipedia in <15 minutes on Modal Labs for ~$17. Here's what this actually means for an organization: 1. Unrestricted rates: eliminates the bottleneck of rate limits for large-scale operations. 2. Rapid experimentation: allows quick iterations of
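The pattern behind "embed all of Wikipedia in <15 minutes" is to shard the corpus into batches and embed them in parallel against an endpoint you control, so no provider rate limit applies. A minimal sketch of that batching/fan-out structure; `embed_batch` is a deterministic stand-in for whatever model endpoint you deploy (e.g. on Modal), not Modal's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

def embed_batch(texts: list[str]) -> list[list[float]]:
    # Placeholder "embedding": a toy 2-dim vector per text. In practice this
    # would call your self-hosted embedding model over the network.
    return [[float(len(t)), float(sum(map(ord, t)) % 997)] for t in texts]

def embed_corpus(passages: list[str], batch_size: int = 256, workers: int = 8):
    # Shard the corpus into fixed-size batches...
    batches = [passages[i:i + batch_size]
               for i in range(0, len(passages), batch_size)]
    # ...and embed them concurrently; map() preserves batch order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(embed_batch, batches)
    # Flatten back to one vector per passage, order preserved.
    return [vec for batch in results for vec in batch]

vectors = embed_corpus([f"passage {i}" for i in range(1000)], batch_size=128)
print(len(vectors))  # 1000
```

Throughput then scales with `workers` and the capacity you provision, rather than with someone else's rate limit.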
How do I create a task-specific dataset? Easy: I label a few thousand samples using another model like mixtral, fine-tune a model on them, evaluate, repeat
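The loop in that tweet — label with a strong model, fine-tune a smaller one, evaluate, repeat — can be sketched as below. The labeler, trainer, and the toy "even-length" task are deterministic stubs standing in for real mixtral calls and a real fine-tuning run:

```python
import random

def label_with_strong_model(texts):
    # Stand-in for labeling with mixtral; toy rule: positive iff even length.
    return [{"text": t, "label": len(t) % 2 == 0} for t in texts]

def fine_tune(dataset):
    # Stand-in for a fine-tuning run; returns a "model" that learned the rule.
    return lambda t: len(t) % 2 == 0

def evaluate(model, dataset):
    correct = sum(model(ex["text"]) == ex["label"] for ex in dataset)
    return correct / len(dataset)

rng = random.Random(0)
accuracy, rounds = 0.0, 0
while accuracy < 0.95 and rounds < 5:
    texts = ["x" * rng.randint(1, 20) for _ in range(2000)]  # a few thousand samples
    dataset = label_with_strong_model(texts)                 # 1. label
    model = fine_tune(dataset)                               # 2. fine-tune
    accuracy = evaluate(model, dataset)                      # 3. evaluate
    rounds += 1                                              # 4. repeat

print(rounds, accuracy)
```

With real models, the exit condition would be accuracy on a held-out, human-checked eval set rather than on the synthetic labels themselves.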
@MatthewBerman if human civilization were completely eradicated, a new civilization could be constructed entirely from this stick, bypassing millions of years of evolutionary steps.