Andy Keller

@t_andy_keller

Followers 4K · Following 2K · Media 44 · Statuses 265

Postdoctoral Fellow at The Kempner Institute at Harvard University -- Somewhere between Brains & Bits. PhD at UvA, Intern @ Apple MLR, Prev @ Intel AI & Nervana

Joined March 2014
@t_andy_keller
Andy Keller
4 months
Why do video models handle motion so poorly? It might be lack of motion equivariance. Very excited to introduce: Flow Equivariant RNNs (FERNNs), the first sequence models to respect symmetries over time. Paper: https://t.co/dkk43PyQe3 Blog: https://t.co/I1gpam1OL8 1/🧵
8
73
397
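The static (velocity-zero) case of the symmetry FERNNs extend can be checked in a few lines of numpy. The `conv_rnn` below is a toy illustration, not the FERNN architecture from the paper: circular convolutions make every frame's update commute with spatial shifts, which is ordinary translation equivariance. Flow equivariance, as the tweet describes it, is the harder time-parameterized version of this property for moving inputs.

```python
import numpy as np

def circ_conv(signal, kernel):
    """Circular 1D convolution (translation-equivariant by construction)."""
    n = len(signal)
    return np.array([sum(kernel[j] * signal[(i - j) % n] for j in range(len(kernel)))
                     for i in range(n)])

def conv_rnn(x_seq, k_x, k_h):
    """Toy convolutional RNN: h_t = relu(k_h * h_{t-1} + k_x * x_t)."""
    h = np.zeros(x_seq.shape[1])
    hs = []
    for x in x_seq:
        h = np.maximum(0.0, circ_conv(h, k_h) + circ_conv(x, k_x))
        hs.append(h)
    return np.stack(hs)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))           # 5 timesteps, 16 spatial positions
k_x, k_h = rng.normal(size=3), rng.normal(size=3)

# Static symmetry: shifting every frame by the same amount commutes with
# the network, because circular convs and pointwise relu commute with roll.
shift = lambda seq, s: np.roll(seq, s, axis=-1)
out_then_shift = shift(conv_rnn(x, k_x, k_h), 4)
shift_then_out = conv_rnn(shift(x, 4), k_x, k_h)
assert np.allclose(out_then_shift, shift_then_out)
```

For a time-varying shift (a constant-velocity flow, rolling frame t by v·t), this plain recurrence no longer commutes, since the hidden state is never transported; accounting for that is the gap the tweet's flow-equivariant models address.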
@AtlasKazemian
Atlas Kazemian
3 days
Super excited to share that my Master’s project, “Convolutional architectures are cortex-aligned de novo,” has been published in Nature Machine Intelligence! https://t.co/Zmy1XwymFB w/ @EricElmoznino @michaelfbonner
3
14
32
@isabelpapad
Isabel Papadimitriou
1 year
Do you want to understand how language models work, and how they can change language science? I'm recruiting PhD students at UBC Linguistics! The research will be fun, and Vancouver is lovely. So much cool NLP happening at UBC across both Ling and CS! https://t.co/IxKvy4Um1I
14
84
404
@BoZhao__
Bo Zhao
11 days
There are lots of symmetries in neural networks! 🔍 We survey where they appear, how they shape loss landscapes and learning dynamics, and their applications in optimization, weight space learning, and much more. ➡️ Symmetry in Neural Network Parameter Spaces https://t.co/vzpKp3MvkI
8
45
230
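The simplest parameter-space symmetry of the kind this survey covers can be demonstrated directly: permuting the hidden units of a small MLP (together with the matching rows and columns of its weights) leaves the computed function unchanged. A toy numpy sketch, with illustrative names, not code from the survey:

```python
import numpy as np

def mlp(x, W1, b1, W2):
    """Two-layer MLP: y = W2 @ relu(W1 @ x + b1)."""
    return W2 @ np.maximum(0.0, W1 @ x + b1)

rng = np.random.default_rng(1)
W1, b1, W2 = rng.normal(size=(8, 4)), rng.normal(size=8), rng.normal(size=(3, 8))
x = rng.normal(size=4)

# Permuting hidden units (rows of W1 and b1, columns of W2) is a
# parameter-space symmetry: a different point in weight space that
# computes exactly the same function.
perm = rng.permutation(8)
assert np.allclose(mlp(x, W1, b1, W2),
                   mlp(x, W1[perm], b1[perm], W2[:, perm]))
```

Every such permutation maps one loss-landscape minimum onto another with identical loss, which is one way these symmetries shape the learning dynamics the tweet mentions.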
@iaifi_news
IAIFI
24 days
Join us today (Friday, October 24) at 1:00pm ET for our next IAIFI Colloquium featuring @t_andy_keller (Harvard University Kempner Institute): Flow Equivariance: Enforcing Time-Parameterized Symmetries in Sequence Models. Watch on YouTube: https://t.co/Syh6lxud2I
0
1
3
@erikjbekkers
Erik Bekkers
28 days
As promised after our great discussion, @chaitanyakjoshi! Your inspiring post led to our formal rejoinder: the Platonic Transformer. What if the "Equivariance vs. Scale" debate is a false premise? Our paper shows you can have both. 📄 Preprint: https://t.co/kd8MFiOmuG 1/9
@chaitjo
Chaitanya K. Joshi
6 months
After a long hiatus, I've started blogging again! My first post was a difficult one to write, because I don't want to keep repeating what's already in papers. I tried to give some nuanced and (hopefully) fresh takes on equivariance and geometry in molecular modelling.
1
28
93
@Napoolar
Thomas Fel
1 month
🕳️🐇Into the Rabbit Hull – Part II Continuing our interpretation of DINOv2, the second part of our study concerns the geometry of concepts and the synthesis of our findings toward a new representational phenomenology: the Minkowski Representation Hypothesis
5
69
382
@Napoolar
Thomas Fel
1 month
🕳️🐇Into the Rabbit Hull – Part I (Part II tomorrow) An interpretability deep dive into DINOv2, one of vision’s most important foundation models. And today is Part I, buckle up, we're exploring some of its most charming features.
10
121
643
@MillerLabMIT
Earl K. Miller
1 month
Appearing soon: State-space trajectories and traveling waves following distraction, J of Cog. Neuro., in press. A direct link between spiking patterns moving through subspace and traveling waves propagating across the cortex. Preprint: https://t.co/8bbQTaE8Ko #neuroscience
biorxiv.org
Cortical activity shows the ability to recover from distractions. We analyzed neural activity from the prefrontal cortex (PFC) of monkeys performing working memory tasks with mid-memory-delay...
3
16
87
@phogat_richa
RichaPhogat
2 months
New preprint alert 🧠 Ever wondered how the cortex and hippocampus communicate during health and what changes during disease? With @DrBreaky and an amazing team, we built the first geometry-aware model of cortico-hippocampal interactions to answer this. https://t.co/oyC1HZKqvQ
biorxiv.org
Functional interactions between cortex and hippocampus play a central role in cognition and are disrupted in major neurological disorders, but the mechanisms underlying coordinated cortico-hippocam...
4
28
75
@jensegholm
Jens E. Pedersen
2 months
New paper on covariant #neuromorphic networks! We're connecting decades of work in computer vision with decades of work in spiking networks to present spatio-temporal spiking processing. https://t.co/uFlA3I6BuI
2
8
34
@t_andy_keller
Andy Keller
3 months
Incredibly rigorous and precise study of one of the most fundamental visual processing capabilities of deep feedforward neural networks, contour integration -- just great science. Kudos Fenil! 👏
@fenildoshi009
Fenil Doshi
3 months
🧵 Can a purely feedforward network — with no recurrence or lateral connections — capture human-like perceptual organization? 🤯 Yes! Especially for contour integration, and we pinpoint the key inductive biases. New paper in @PLOSCompBiol with @talia_konkle & @grez72! 1/24
1
2
29
@t_andy_keller
Andy Keller
3 months
Great thread! Very excited to see the 'more' that comes next!
@StphTphsn1
Stéphane Deny
3 months
In a classic study of "mental rotation", Shepard and Metzler (1971) found that the time to compare two 3D cube-made objects was proportional to their angular difference. But *what is going on in the brain* during this process? 🔗 Shepard & Metzler (1971): https://t.co/fbtMnbPZR0
0
0
7
@GretaTuckute
Greta Tuckute
3 months
Humans largely learn language through speech. In contrast, most LLMs learn from pre-tokenized text. In our #Interspeech2025 paper, we introduce AuriStream: a simple, causal model that learns phoneme, word & semantic information from speech. Poster P6, Aug 19 at 13:30, Foyer 2.2!
8
33
194
@t_andy_keller
Andy Keller
3 months
Looking forward to seeing @mozesjacobs give a short talk on this at CCN (@CogCompNeuro) in ~30mins! Watch the recorded livestream below or come by poster C171 to say hi! Stream:
@t_andy_keller
Andy Keller
8 months
In the physical world, almost all information is transmitted through traveling waves -- why should it be any different in your neural network? Super excited to share recent work with the brilliant @mozesjacobs: "Traveling Waves Integrate Spatial Information Through Time" 1/14
0
5
40
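The core intuition of that thread — a wave carrying spatial information past a fixed local observer — fits in a few lines of numpy. This is a toy sketch of the idea, not the model from the paper: activity on a ring of units shifts one position per timestep, so a single readout neuron eventually sees the entire spatial pattern through time.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
h = rng.normal(size=n)         # initial spatial pattern on a ring of units
pattern = h.copy()

# Pure traveling-wave recurrence: activity shifts one unit per timestep.
readout = []                   # one neuron at position 0 watches the wave
for _ in range(n):
    readout.append(h[0])
    h = np.roll(h, 1)

# After n steps the local readout has observed every spatial position:
# the wave delivered the whole pattern to one site, integrated over time.
assert np.allclose(readout, np.roll(pattern[::-1], 1))
```

The readout sequence is the initial pattern traversed in order of distance from the readout site, which is exactly the sense in which the wave converts spatial structure into temporal structure.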
@StphTphsn1
Stéphane Deny
4 months
Equivariance meets RNNs. An exciting research direction!
@t_andy_keller
Andy Keller
4 months
Why do video models handle motion so poorly? It might be lack of motion equivariance. Very excited to introduce: Flow Equivariant RNNs (FERNNs), the first sequence models to respect symmetries over time. Paper: https://t.co/dkk43PyQe3 Blog: https://t.co/I1gpam1OL8 1/🧵
0
2
23
@KempnerInst
Kempner Institute at Harvard University
4 months
New in the #DeeperLearningBlog: #KempnerInstitute research fellow @t_andy_keller introduces the first flow equivariant neural networks, which reflect motion symmetries, greatly enhancing generalization and sequence modeling. https://t.co/B1YVESrRcR #AI #NeuroAI
kempnerinstitute.harvard.edu
Sequence transformations, like visual motion, dominate the world around us, but are poorly handled by current models. We introduce the first flow equivariant models that respect these motion symmet...
0
4
16
@t_andy_keller
Andy Keller
4 months
Huge thanks to all my friends and advisors who helped me develop this work. Specifically, this paper would never have happened without @wellingmax's guidance. See the blog for an intro, and the paper for all the proofs! Blog: https://t.co/I1gpam1OL8 Code:
github.com
Official repository for the paper "Flow Equivariant Recurrent Neural Networks" - akandykeller/FERNN
0
1
29
@t_andy_keller
Andy Keller
4 months
Excitingly, I also think that FERNNs give us a formal bridge between spatiotemporal neural dynamics and equivariance. Brains are recurrent, embodied sequence learners; cortical traveling waves may be doing exactly this kind of frame shift. See prior work:
@t_andy_keller
Andy Keller
2 years
Traveling waves are known to exist throughout the brain in a variety of forms — there are many hypotheses, but their exact computational role is debated. Together with @wellingmax we built an RNN which exhibits traveling waves to see what it could do. Here’s what we think: 1/7
1
0
12