Conference on Parsimony and Learning (CPAL)

@CPALconf

Followers
944
Following
443
Media
53
Statuses
264

CPAL is a new annual research conference focused on the parsimonious, low-dimensional structures that prevail in ML, signal processing, optimization, and beyond.

Joined May 2023
@CPALconf
Conference on Parsimony and Learning (CPAL)
2 months
Calling all parsimony and learning researchers 🚨🚨 The 3rd annual CPAL will be held in Tübingen Germany March 23–26, 2026! Check out this year's website for all the details https://t.co/Ra08OCHmA9
0
7
17
@Shiwei_Liu66
Shiwei Liu
7 days
17 days to go! Don’t miss the deadline!
@ELLISforEurope
ELLIS
7 days
📣 Call for Papers: Conference on Parsimony and Learning, 2026 Join @ELLISInst_Tue & @MPI_IS to explore and share your work on parsimony in ML, signal processing, optimisation & beyond. 🗓 March 23-26, 2026 📍 Tübingen 🇩🇪 🔗 Details & submission: https://t.co/9eYbNi3lIM
0
3
8
@CPALconf
Conference on Parsimony and Learning (CPAL)
6 days
Hey there 🎵 you’re a rock star 🎶 Don’t miss out on your chance to be a CPAL Rising Star! Submit by December 15th: https://t.co/GVExJzlaf4
0
1
5
@Jere_je_je
Jeremias Sulam
7 days
Deadline for CPAL coming up on Dec 5! Submit your best work on Parsimony and Learning and come join us in Tübingen in March!
@CPALconf
Conference on Parsimony and Learning (CPAL)
2 months
Calling all parsimony and learning researchers 🚨🚨 The 3rd annual CPAL will be held in Tübingen Germany March 23–26, 2026! Check out this year's website for all the details https://t.co/Ra08OCHmA9
0
2
3
@WeiHuang_USTC
Wei Huang
12 days
Tired of receiving AI-generated reviews? Try CPAL! CPAL keeps things small — so every paper is reviewed by real experts, not algorithms.
@CPALconf
Conference on Parsimony and Learning (CPAL)
13 days
Getting AI-written reviews for your AI paper? 😅 Try CPAL 2026 — a focused conference where humans actually read your work! Every paper gets careful attention from area chairs 🧑, not bots 🤖. Join us in Tübingen → https://t.co/cXKbuiJdsL   #CPAL2026 #AI
0
1
5
@Shiwei_Liu66
Shiwei Liu
12 days
Love seeing @OpenAI highlight sparse circuits — sparsity is finally getting the attention it deserves. In our earlier work, we showed how sparse training can unlock robustness, efficiency, and better scaling: ICML’21 • NeurIPS’21 • ICLR'22 • ICML'24 • ICLR'23. Many great
@nabla_theta
Leo Gao
12 days
Excited to share our latest work on untangling language models by training them with extremely sparse weights! We can isolate tiny circuits inside the model responsible for various simple behaviors and understand them unprecedentedly well.
3
13
80
@Shiwei_Liu66
Shiwei Liu
13 days
Tired of receiving AI-generated reviews? Try CPAL! CPAL keeps things small — so every paper is reviewed by real experts, not algorithms.
@CPALconf
Conference on Parsimony and Learning (CPAL)
13 days
Getting AI-written reviews for your AI paper? 😅 Try CPAL 2026 — a focused conference where humans actually read your work! Every paper gets careful attention from area chairs 🧑, not bots 🤖. Join us in Tübingen → https://t.co/cXKbuiJdsL   #CPAL2026 #AI
0
3
8
@CPALconf
Conference on Parsimony and Learning (CPAL)
13 days
Getting AI-written reviews for your AI paper? 😅 Try CPAL 2026 — a focused conference where humans actually read your work! Every paper gets careful attention from area chairs 🧑, not bots 🤖. Join us in Tübingen → https://t.co/cXKbuiJdsL   #CPAL2026 #AI
0
2
11
@CPALconf
Conference on Parsimony and Learning (CPAL)
28 days
CPAL 2026 has an INCREDIBLE list of confirmed keynote speakers: @BachFrancis @bschoelkopf @FannyYangETH @arkrause @MatthiasBethge @StefanieJegelka Niao He Taiji Suzuki Jared Tanner Ulrike von Luxburg
0
1
4
@CPALconf
Conference on Parsimony and Learning (CPAL)
1 month
Calling all junior researchers! Apply now for CPAL's Rising Star Award💫 A great opportunity to get recognized by the community and network with senior researchers! Apply now through December 15th: https://t.co/GVExJzkCpw
0
1
5
@Shiwei_Liu66
Shiwei Liu
1 month
Super excited to have CPAL 2026 at Tübingen. At least two reasons to submit to CPAL: (1) professional reviewers in your fields, (2) come to see how this beautiful old city is being reborn in the age of AI.
@CPALconf
Conference on Parsimony and Learning (CPAL)
2 months
Calling all parsimony and learning researchers 🚨🚨 The 3rd annual CPAL will be held in Tübingen Germany March 23–26, 2026! Check out this year's website for all the details https://t.co/Ra08OCHmA9
0
2
9
@CPALconf
Conference on Parsimony and Learning (CPAL)
1 month
Take a break from hunting demons and submit to CPAL! Now accepting submissions on OpenReview https://t.co/EkHOmmmgH5
0
1
5
@YiMaTweets
Yi Ma
1 month
ICCV 2025 tutorial today.
2
23
282
@CPALconf
Conference on Parsimony and Learning (CPAL)
2 months
Frustrated by a rejection from NeurIPS 😠? Consider submitting your work on sparsity/optimization/learning to CPAL, a conference and community that *appreciates* high quality work ☺️! Deadline December 5th: https://t.co/Ra08OCHmA9
0
2
6
@YiMaTweets
Yi Ma
2 months
I am very excited to announce that the third CPAL conference: https://t.co/xVwrpSEvoM is to be held at Tübingen, Germany in March 2026 (after Hong Kong and Stanford). Complementary to the arguably too many mega conferences on machine intelligence, CPAL aims to be focused and
cpal.cc
Conference on Parsimony and Learning (CPAL) - Addressing the low-dimensional structures in high-dimensional data that prevail in machine learning, signal processing, optimization, and beyond.
0
12
30
@YiMaTweets
Yi Ma
3 months
The course/book is to be about AI and by AI.
@TianzheC
Tianzhe Chu
3 months
[1/3]🤨Many grad school courses teach you to train a LM but what if we train one to teach? Here is our practice! We build & deploy a cute 7B LM that can answer questions regarding the book. As it's only 7B, it will not relieve students from mathy homework.🤡
2
4
23
@ai_ngrosso
Alessandro Ingrosso
2 months
Please RT - Open PhD position in my group at the Donders Center for Neuroscience, Radboud University. We're looking for a PhD candidate interested in developing theories of learning in neural networks. Applications are open until October 20th. Info:
ru.nl
Thank you for your interest in working at Radboud University. We are no longer taking applications for this job.
0
11
14
@YiMaTweets
Yi Ma
8 months
I came to realize that, for studying intelligence, the only "inductive bias" that one is allowed to assume is that the data worth sensing and learning has extremely low intrinsic dimension. This can be precisely verified and quantified by the data's volume, entropy, or coding
15
25
260
@yuxiangw_cs
Yu-Xiang Wang
8 months
#CPAL2025 invited speaker @tydsh shows that the attention map first goes *sparser*, then gets *denser*, as we train a nonlinear attention mechanism using gradient flow — and the phenomena show up in the experiments too.
2
6
34
@YiMaTweets
Yi Ma
8 months
There are only two types of generic linear/matrix operations that truly scale: one is low-rank and the other is sparse.
@yuxiangw_cs
Yu-Xiang Wang
8 months
#CPAL2025 invited speaker @tydsh shows that the attention map first goes *sparser*, then gets *denser*, as we train a nonlinear attention mechanism using gradient flow — and the phenomena show up in the experiments too.
4
5
66