Matthias Boehm

@matthiasboehm7

Followers
1K
Following
33K
Media
108
Statuses
2K

Prof at TU Berlin and BIFOLD; research on ML systems and data management. @matthiasboehm7.bsky.social

Berlin, Germany
Joined May 2019
@matthiasboehm7
Matthias Boehm
1 day
We have another open call for a PhD position on ML system internals in my group: https://t.co/STUlW6OrjN (funded for up to 5 years, application deadline Nov 21)
1
3
8
@JeffDean
Jeff Dean
3 days
Our 7th generation TPU (Ironwood) is becoming generally available for Cloud TPU customers. It's got greatly improved performance and efficiency over previous generations! 🎉
@sundarpichai
Sundar Pichai
3 days
Our 7th gen TPU Ironwood is coming to GA!  It’s our most powerful TPU yet: 10X peak performance improvement vs. TPU v5p, and more than 4X better performance per chip for both training + inference workloads vs. TPU v6e (Trillium). We use TPUs to train + serve our own frontier
31
73
1K
@andrewlamb1111
Andrew Lamb
6 days
If anyone wants to know why Xiangpeng Hao is a great mentor, they can read this response: https://t.co/Zvu9c046DJ
0
4
91
@matthiasboehm7
Matthias Boehm
5 days
The last two days, @BigEarthBIFOLD and I attended hearings at @PIK_Climate. We saw awesome candidates, great talks, interesting perspectives, and a beautiful campus. 🌳
0
1
2
@HoseKatja
Katja Hose
11 days
2 #Postdoc positions at TU Wien (Vienna, Austria)
1️⃣ Data Warehousing & System Performance https://t.co/dVT8VlhXei
2️⃣ Natural Language Interfaces, LLMs, Exploratory Data Access https://t.co/bDuOyF9PCC
Application Deadline: Nov 13, 2025 Details:
0
2
7
@StasBekman
Stas Bekman
15 days
PyTorch announced Monarch which is meant to simplify distributed programming—your code looks and feels like a single-machine Python program, but can scale across thousands of GPUs. You can directly use Pythonic constructs—classes, functions, loops, tasks, futures—to express
15
80
908
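A minimal stdlib sketch of the "single-machine feel" the Monarch tweet describes — plain functions, loops, and futures. This is not Monarch's actual API; it uses `concurrent.futures` locally, where a framework like Monarch would dispatch the same calls across many GPUs.

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

def run(xs):
    # Fan work out with futures, then gather results in submission order.
    # A distributed framework would route these calls to remote workers;
    # here a local thread pool stands in for them.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(square, x) for x in xs]
        return [f.result() for f in futures]

print(run(range(8)))  # [0, 1, 4, 9, 16, 25, 36, 49]
```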
@SnorkelAI
Snorkel AI
17 days
New benchmark drop 🚀 SnorkelSpatial tests how well LLMs can think in space, following text-based moves and rotations in a 2D world.
2
3
19
@karpathy
Andrej Karpathy
27 days
Excited to release new repo: nanochat! (it's among the most unhinged I've written). Unlike my earlier similar repo nanoGPT which only covered pretraining, nanochat is a minimal, from scratch, full-stack training/inference pipeline of a simple ChatGPT clone in a single,
664
3K
24K
@matthiasboehm7
Matthias Boehm
29 days
Last week we had another FONDA II (collaborative research center) retreat, and here is a picture of Albert Einstein's summer house from a post-lunch walk.
0
0
9
@vllm_project
vLLM
1 month
How does @deepseek_ai Sparse Attention (DSA) work? It has 2 components: the Lightning Indexer and Sparse Multi-Latent Attention (MLA). The indexer keeps a small key cache of 128 dims per token (vs. 512 for MLA) and scores incoming queries against it; the top-2048 tokens are then passed to Sparse MLA.
@deepseek_ai
DeepSeek
1 month
🚀 Introducing DeepSeek-V3.2-Exp — our latest experimental model! ✨ Built on V3.1-Terminus, it debuts DeepSeek Sparse Attention (DSA) for faster, more efficient training & inference on long context. 👉 Now live on App, Web, and API. 💰 API prices cut by 50%+! 1/n
11
108
717
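The indexer step described in the vLLM tweet — score every cached token, keep only the top-2048 for sparse attention — can be sketched with a plain top-k selection. Shapes and the dot-product scoring below are illustrative assumptions, not DeepSeek's actual implementation.

```python
import numpy as np

def select_top_tokens(query, key_cache, k=2048):
    """Score every cached key against the query and return the
    indices of the k highest-scoring tokens (unsorted)."""
    scores = key_cache @ query             # (seq_len,) dot-product scores
    k = min(k, scores.shape[0])            # guard short sequences
    return np.argpartition(scores, -k)[-k:]

rng = np.random.default_rng(0)
q = rng.standard_normal(128)               # 128-dim indexer query, per the tweet
cache = rng.standard_normal((4096, 128))   # 4096 cached token keys
idx = select_top_tokens(q, cache)
print(len(idx))                            # 2048 tokens survive for Sparse MLA
```

`argpartition` avoids a full sort, which matters when the cache covers very long contexts.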
@matthiasboehm7
Matthias Boehm
1 month
A postcard from ADBIS/TPDL 2025 in Tampere, Finland last week. It was a great and really enjoyable event.
0
1
15
@matthiasboehm7
Matthias Boehm
2 months
Congratulations @SGrafberger on the successful PhD defense at @UvA_Amsterdam. 🎆 It was an extraordinary experience being on the committee and attending the event. All the best for the start at @Snowflake.
0
0
18
@sigmod
ACM SIGMOD
2 months
Dear Database Researchers, We are now seeking nominations from the database community for the 2025 Research Highlights. Check the following msg for details: https://t.co/jnjpCASAXW
0
2
2
@DAlistarh
Dan Alistarh
2 months
We're releasing the DASLab GGUF Quantization Toolkit! 🚀 First open-source toolkit bringing GPTQ + EvoPress to @ggerganov's GGUF format, enabling heterogeneous quantization based on importance. Result: Better models at the same file size. [1/5]
4
50
270
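A toy sketch of the idea behind heterogeneous quantization "based on importance": spend more bits on the layers that matter most, under a fixed total budget. This greedy allocator is illustrative only and is not the DASLab GPTQ + EvoPress algorithm.

```python
def allocate_bits(importance, budget, choices=(2, 4, 8)):
    """Assign a bit-width per layer: start all layers at the lowest
    width, then upgrade layers in descending importance while the
    total budget allows."""
    n = len(importance)
    bits = {i: choices[0] for i in range(n)}
    spend = choices[0] * n
    for i in sorted(range(n), key=lambda i: -importance[i]):
        for b in choices[1:]:
            if spend + (b - bits[i]) <= budget:
                spend += b - bits[i]
                bits[i] = b
    return bits

# Three layers, 14 bits total: the most important layer gets 8 bits,
# the least important stays at 2.
print(allocate_bits([0.9, 0.1, 0.5], budget=14))  # {0: 8, 1: 2, 2: 4}
```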
@matthiasboehm7
Matthias Boehm
2 months
A postcard from Berlin (some impressions from today's morning walk to the office)
0
0
11
@fchollet
François Chollet
2 months
To solve any problem, you don't have to be super smart. You just have to 1) be able to break down problems into subproblems, 2) be slightly smarter than the hardest of the atomic subproblems. The real challenge is that the process can take a very long time.
191
569
6K
@mlsec
Konrad Rieck 🌈
2 months
Got some hot research cooking? 🔥 The @satml_conf paper deadline is just 9 days away. We are looking forward to your work on security, privacy, and fairness in machine learning. 👉 https://t.co/cPFitltvjA ⏰ Sep 24
0
6
17
@jure
Jure Leskovec
2 months
Teaching in the age of AI has brought challenges I never expected—even leading my students and me back to paper exams. 📄✏️ Fortune just published a thoughtful piece on how we’re navigating the role of humans and machines in education, research, and beyond. You can read it here:
fortune.com
"We had a big, I don't know, existential crisis among students a few years back," Jure Leskovec told Fortune, "when it kind of wasn't clear what our role is in this world."
3
20
88
@pvldb
PVLDB
2 months
Vol:18 No:12 → Enter the Warp: Fast and Adaptive Data Transfer with XDBC https://t.co/4xHs6FB2gg
0
3
10
@matthiasboehm7
Matthias Boehm
2 months
Congrats to all distinguished AEs and reviewers.
@HoseKatja
Katja Hose
2 months
Honored to have received a Distinguished Associate Editor Award at @VLDBconf 2025! It was a pleasure to contribute to the review process, and I'm truly grateful for the recognition. Congratulations to all awardees! #VLDB2025
0
0
7