Jinwoo Kim
@jw9730
Followers
521
Following
2K
Media
17
Statuses
560
PhD student at KAIST and visiting scholar at NYU, graph and geometric deep learning.
Joined August 2020
New preprint: Flock, a foundation model for link prediction on knowledge graphs that zero-shot generalizes to novel entities and relations. Instead of message passing, Flock operates on anonymized random walks, processed by sequence neural nets. Paper: https://t.co/bKmKwmh7Fa
2
20
121
Today, we’re announcing Kosmos, our newest AI Scientist, available to use now. Users estimate Kosmos does 6 months of work in a single day. One run can read 1,500 papers and write 42,000 lines of code. At least 79% of its findings are reproducible. Kosmos has made 7 discoveries
195
639
4K
There are many symmetries in neural networks! 🔍 We survey where they appear, how they shape loss landscapes and learning dynamics, and their applications in optimization, weight space learning, and much more. ➡️ Symmetry in Neural Network Parameter Spaces https://t.co/vzpKp3MvkI
8
45
230
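Not from the survey, just a minimal numpy illustration of one such parameter-space symmetry: permuting the hidden units of a one-hidden-layer MLP (together with the matching weight rows and columns) changes the parameters but not the function the network computes.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer MLP: f(x) = W2 @ relu(W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Permute hidden units: rows of (W1, b1) and the matching columns of W2.
perm = rng.permutation(8)
x = rng.normal(size=4)

# Different parameter vectors, identical outputs: a loss-landscape symmetry.
assert np.allclose(mlp(x, W1, b1, W2, b2),
                   mlp(x, W1[perm], b1[perm], W2[:, perm], b2))
```

Every such permutation maps a trained network to a distinct but loss-equivalent point in parameter space, one source of the landscape structure surveys like this one study.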
Learning Linear Attention in Polynomial Time.
arxiv.org
Previous research has explored the computational expressivity of Transformer models in simulating Boolean circuits or Turing machines. However, the learnability of these simulators from...
0
10
57
[LG] Prior Makes It Possible: From Sublinear Graph Algorithms to LLM Test-Time Methods A Blum, D Hsu, C Rashtchian, D Saless [Toyota Technological Institute at Chicago & Columbia University & Google Research] (2025) https://t.co/mjHLCWZZKn
0
3
14
"Equivariance matters even more at larger scales" ~ https://t.co/hSdOxIP3UD All the more reason we need scalable architectures with symmetry awareness. I know this is an obvious ask but I'm still confident that scaling and inductive bias need not be at odds. This paper
1
23
181
We have shown a complete paradigm shift from existing KGFM designs: no more message passing and deterministic equivariance! Thanks @jw9730, @kolejnyyyy, Kyungbin Min, @mmbronstein, Seunghoon Hong, @ismaililkanc for the amazing collaboration!
New preprint: Flock, a foundation model for link prediction on knowledge graphs that zero-shot generalizes to novel entities and relations. Instead of message passing, Flock operates on anonymized random walks, processed by sequence neural nets. Paper: https://t.co/bKmKwmh7Fa
0
4
18
Very excited to share this! We introduce a new approach to knowledge graph foundation models built on probabilistic equivariance. The model is simple, expressive, and probabilistically equivariant — and it works remarkably well! Collaboration led by @jw9730 and @hxyscott.
New preprint: Flock, a foundation model for link prediction on knowledge graphs that zero-shot generalizes to novel entities and relations. Instead of message passing, Flock operates on anonymized random walks, processed by sequence neural nets. Paper: https://t.co/bKmKwmh7Fa
0
6
42
Flock is a joint work with amazing coauthors @hxyscott, Krzysztof Olejniczak, Kyungbin Min, @mmbronstein, Seunghoon Hong, and @ismaililkanc!! Preprint: https://t.co/bKmKwmh7Fa Code:
github.com
[arXiv'25] Flock: A Knowledge Graph Foundation Model via Learning on Random Walks, in PyTorch - jw9730/flock-pytorch
0
0
3
Lastly, we find that Flock benefits from two axes of scaling: pretraining data scaling and test-time scaling (ensembling randomized predictions; a sketch follows below). This confirms a core characteristic of a foundation model.
1
0
2
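A minimal sketch of what the test-time scaling axis could look like, under assumptions: flock_score is a hypothetical single-run scorer (not the released API) whose output varies across calls because each run samples fresh random walks; averaging over independent runs forms the randomized-prediction ensemble.

```python
import torch

def ensemble_scores(flock_score, query, num_runs: int = 16) -> torch.Tensor:
    """Average link scores over independent randomized runs.

    Each call to flock_score(query) is assumed to re-sample random walks
    internally, so stacking runs and averaging reduces the variance of the
    final prediction; increasing num_runs is the test-time scaling axis."""
    runs = torch.stack([flock_score(query) for _ in range(num_runs)])
    return runs.mean(dim=0)
```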
When pretrained on 3 KGs and tested on 54 diverse KGs for entity and relation prediction, Flock outperforms current KGFMs in zero-shot and finetuning setups, with especially large gains in relation prediction.
1
0
3
We propose a new diagnostic dataset, Petals: KGs where relations are structurally isomorphic but semantically different. Deterministically equivariant KGFMs score ~50% (random guess); Flock reaches 100% accuracy.
1
0
3
Theory validates the design: (1) Flock is equivariant in probability. (2) With a sufficiently expressive sequence net, Flock is a universal approximator of link‑invariant functions on bounded‑size KGs.
1
0
3
We design Flock as follows, inspired by recent works on probabilistic invariance via random walks (a minimal sketch follows below):
- Run random walks
- Anonymize entities & relations
- Encode with a sequence net (GRU)
- Pool tokens back to entities & relations
1
0
3
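A minimal PyTorch sketch of this recipe, with assumed shapes and names; the official implementation is in the jw9730/flock-pytorch repo linked above.

```python
import torch
import torch.nn as nn

class FlockSketch(nn.Module):
    """Sketch of the Flock recipe (assumed design details, not the official
    code): anonymize a random walk, encode it with a GRU, then pool token
    states back to the entities/relations visited on the walk."""

    def __init__(self, max_ids: int = 256, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(max_ids, dim)  # anonymized IDs, no global vocab
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.dim = dim

    @staticmethod
    def anonymize(walk):
        # Relabel entity/relation tokens by order of first occurrence, so the
        # encoder never sees global identities (key to zero-shot transfer).
        mapping = {}
        return torch.tensor([mapping.setdefault(t, len(mapping)) for t in walk])

    def forward(self, walk):
        ids = self.anonymize(walk)                    # (L,)
        states, _ = self.gru(self.embed(ids)[None])   # (1, L, dim)
        states = states[0]
        # Pool: average the states of all positions sharing an anonymized ID,
        # yielding one feature per distinct entity/relation on the walk.
        n = int(ids.max()) + 1
        pooled = torch.zeros(n, self.dim).index_add_(0, ids, states)
        counts = torch.bincount(ids, minlength=n).clamp(min=1)
        return pooled / counts.unsqueeze(1)           # (n, dim)

# Toy walk alternating entities and relations; strings stand in for raw IDs.
walk = ["e1", "r_like", "e2", "r_like", "e3", "r_dislike", "e1"]
feats = FlockSketch()(walk)
```

Because the model only ever sees first-occurrence labels, relabeling the underlying KG changes nothing about the computation, which is where the probabilistic equivariance below comes from.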
To overcome this, we propose to use probabilistic node-relation equivariance, i.e. equivariance in distribution. This keeps the right inductive bias for zero-shot generalization, while allowing models to distinguish asymmetric semantics during prediction → better expressivity.
1
0
3
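A generic formalization of the distinction, not in the paper's exact notation: writing $g \cdot G$ for a joint permutation of entities and relations, deterministic KGFMs must commute with $g$ exactly, while Flock only needs to commute in law over its internal walk randomness.

```latex
\underbrace{f(g \cdot G) = g \cdot f(G)}_{\text{deterministic equivariance}}
\qquad \text{vs.} \qquad
\underbrace{f(g \cdot G) \overset{d}{=} g \cdot f(G)}_{\text{equivariance in distribution}}
```

Here $\overset{d}{=}$ denotes equality in distribution over the sampled random walks, so individual predictions can still break the tie between structurally isomorphic relations.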
Our motivation comes from the node-relation equivariance that underlies the design of knowledge graph foundation models (KGFMs). We find it prevents distinguishing structurally similar but semantically distinct relations (e.g. like/dislike in the figure). This ultimately limits expressivity.
1
0
3
[Group-averaged Markov chains] How can we leverage group structure and averaging to accelerate the mixing of Markov chains? This paper presents a modifier that can be added on top of any existing MCMC sampler. Check it out! RG version: https://t.co/CCoNLOdvXV Submitted to arXiv.
4
32
194
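Not the paper's construction, but a minimal illustration of the idea it builds on: if the target density is invariant under a group, composing any MCMC kernel with a uniformly random group move preserves the target while letting the chain jump between symmetric modes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric double-well target p(x) ∝ exp(-(x² - 4)² / 2), invariant under
# the two-element group G = {identity, sign flip}.
def log_p(x):
    return -((x ** 2 - 4.0) ** 2) / 2.0

def step(x, group_move=True):
    # Base kernel: small-step random-walk Metropolis. Alone, it mixes very
    # slowly between the two wells at x = ±2.
    y = x + 0.3 * rng.normal()
    if np.log(rng.random()) < log_p(y) - log_p(x):
        x = y
    # Group move: apply a uniformly random element of G. Since p is
    # G-invariant, this keeps p stationary while hopping across modes.
    if group_move and rng.random() < 0.5:
        x = -x
    return x

x, samples = 2.0, []
for _ in range(5000):
    x = step(x)
    samples.append(x)
# With the group move the chain covers both wells (sample mean ≈ 0);
# without it, a chain started at x = 2 rarely reaches the other well.
```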
We are publishing a book, "Symmetry and Machine Learning" (「対称性と機械学習」), with Iwanami Shoten on 2025/9/18. The book explains how symmetries found in the physical world can be handled in machine learning, focusing in particular on symmetry under transformations described by Lie groups. An overview of the book and a link to the support page, including the table of contents, are given in the thread.
2
212
795
The countdown is on! Join us in 48 hours for a special announcement about Hollow Knight: Silksong! Premiering here: https://t.co/sPt1IzanDS
4K
33K
137K
🚨 Job Alert: We're hiring a Postdoc in Geometric Deep Learning & AI for Science! Join me & @mmbronstein in an exciting collaboration between AITHYRA and @tu_wien. Apply now 👉 https://t.co/owpm8HqUGN
#AIforScience #GeometricDeepLearning #Postdoc #AITHYRA #TUWien
aithyra.onlyfy.jobs
This job is not active anymore. It was closed on Oct 13, 2025.
1
12
44