
David Chiang
@davidweichiang
2K Followers · 500 Following · 35 Media · 778 Statuses
Associate Professor of Computer Science and Engineering at University of Notre Dame. Natural language processing, formal grammars, machine learning
South Bend, IN
Joined September 2012
By @pentagonalize, Lena Strobl, Dana Angluin, and me, on arXiv:
arxiv.org
We study conditions under which transformers using soft attention can simulate hard attention, that is, effectively focus all attention on a subset of positions. First, we examine several...
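
A minimal sketch of the underlying phenomenon (my illustration, not the paper's construction): scaling soft-attention logits by a growing factor drives the softmax toward a hard attention distribution. With ties, the weights concentrate uniformly on the argmax positions, which is one simple sense in which soft attention can approximate (average) hard attention. The `scale` parameter here is an assumption for illustration.

```python
import numpy as np

def soft_attention(scores, scale=1.0):
    """Softmax attention weights for one query, with a temperature-like scale."""
    z = scale * scores
    z = z - z.max()          # subtract the max for numerical stability
    w = np.exp(z)
    return w / w.sum()

scores = np.array([0.9, 1.0, 0.2, 1.0])   # positions 1 and 3 are tied maxima

for scale in (1.0, 10.0, 100.0):
    print(scale, np.round(soft_attention(scores, scale), 3))
# At scale=100 the weights are ~[0, 0.5, 0, 0.5]: essentially all attention
# falls uniformly on the argmax set, mimicking average hard attention.
```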
Andy Yang @pentagonalize drove the conceptualization, theory, and experiments of this work. I was just the checker and editor!
RT @mhahn29: Very excited about this work: deep results from logic shedding light on Transformers and the benefit of depth.
New on arXiv: Knee-Deep in C-RASP, by @pentagonalize, Michael Cadilhac, and me. The solid stepped line is our theoretical prediction based on which problems C-RASP can solve, and the numbers/colors show what transformers (with no position embedding) can learn.
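
For readers unfamiliar with C-RASP: it is a RASP variant built around counting. A toy sketch of the style, under my own simplification that the core primitive is a prefix count of positions satisfying a predicate, with acceptance decided by comparing counts (this shows the flavor, not the paper's formal definition):

```python
def prefix_count(s, pred):
    """C-RASP-style counting: at each position i, #{j <= i : pred(s[j])}."""
    counts, c = [], 0
    for ch in s:
        c += int(pred(ch))
        counts.append(c)
    return counts

def majority(s):
    """MAJORITY (more a's than b's), a classic counting-solvable problem."""
    num_a = prefix_count(s, lambda ch: ch == "a")
    num_b = prefix_count(s, lambda ch: ch == "b")
    return num_a[-1] > num_b[-1]

print(majority("aaba"), majority("abbb"))  # True False
```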
RT @huangxt233: I'll be presenting our paper together with @mhahn29 at the Saturday morning poster session. Feel free to reach out!
RT @AnganaBorah2: Last week, I had a fantastic time presenting our work on belief congruence in LLMs at the Midwest Speech and Language Day…
Congratulations to @aarsri21 on winning the Best Paper Award at W-NUT at NAACL 2025! This paper applies interventions that simulate noisy text or dialectal variation, to discover how each kind of intervention affects language models differently.
arxiv.org
We present a suite of experiments that allow us to understand the underlying challenges of language model adaptation to nonstandard text. We do so by designing interventions that approximate core...
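
The interventions themselves aren't spelled out in the preview, but here is a hypothetical example of the kind of perturbation that could approximate noisy or nonstandard text; the function name and parameters are mine, not the paper's:

```python
import random

def swap_noise(text, rate=0.1, seed=0):
    """Swap adjacent alphabetic characters with probability `rate` per position."""
    rng = random.Random(seed)
    chars = list(text)
    i = 0
    while i < len(chars) - 1:
        if chars[i].isalpha() and chars[i + 1].isalpha() and rng.random() < rate:
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
            i += 2  # skip past the pair we just swapped
        else:
            i += 1
    return "".join(chars)

print(swap_noise("language model adaptation to nonstandard text", rate=0.3))
```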