David Zhang

@davwzha

Followers: 283 · Following: 814 · Media: 2 · Statuses: 42

ML research at Qualcomm AI

Amsterdam, The Netherlands
Joined July 2019
@dereklim_lzh
Derek Lim
1 year
Our new workshop at ICLR 2025: Weight Space Learning: https://t.co/1CiTQXl3G1 Weights are data. We can learn from weights. Learning can outperform human-designed methods for optimization, interpretability, model merging, and more.
weight-space-learning.github.io: Neural Network Weights as a New Data Modality
@EliahuHorwitz
Eliahu Horwitz
1 year
📢Thrilled to announce our ICLR 2025 @iclr_conf workshop on Weight Space Learning, exploring model weights as a new data modality! 📢 Stay tuned for submission instructions and deadlines. https://t.co/DUag8P01JS #weightspace #weightspacelearning #ICLR2025 #iclr
@LuisOala
Luis Oala
1 year
2025 @iclr_conf workshops are now available at https://t.co/CQqL2geMde. Thank you to all the amazing teams that prepared proposals: out of 120 strong submissions, 40 proposals were selected for 2 days of workshops in Singapore. A blog post with links to workshop sites will follow.
@syhw
Gabriel Synnaeve
1 year
Want to do research in code generation with LLMs and wonky deep learning from the 90s? We're recruiting one Master's (M2) student intern for 2025 at FAIR Paris on my team
@guanhorng_liu
Guan-Horng Liu
1 year
📢Interested in #interning at #FAIR NY? Excited to share that I have one internship position available for #Summer2025 🙂! Looking for PhD students interested in flow/diffusion models and optimal transport/control for structural problems. 🙌Send me your CV, website & GScholar by #Oct16th!
@TacoCohen
Taco Cohen
1 year
🚨 Attention aspiring PhD students: Meta / FAIR is looking for candidates for a joint academic/industry PhD! 🚨 Among others, the CodeGen team is looking for candidates to work on world models for code, discrete search & continuous optimization methods for long-term planning,
@BorisAKnyazev
Boris Knyazev
1 year
Optimization can be sped up by 50% using our NiNo model! It takes a history of parameter values and predicts future parameters leveraging "neural graphs". Accelerating Training with Neuron Interaction and Nowcasting Networks: https://t.co/hd9GOXA3rH code: https://t.co/hdQMTfxECd
@MiltosKofinas
Miltos Kofinas 🦋 @miltoskofinas.bsky.social
2 years
🔍How can we design neural networks that take neural network parameters as input? 🧪Our #ICLR2024 oral on "Graph Neural Networks for Learning Equivariant Representations of Neural Networks" answers this question! 📜: https://t.co/ax5il0xYoj 💻: https://t.co/g18ShZJ3Rg 🧵 [1/9]
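A minimal sketch of the core idea running through these two threads, as I understand it (function and variable names are my own, not from the NiNo or GNN papers): an MLP's weights can be encoded as a "neural graph" whose nodes are neurons and whose edge features are the connecting weights, which a graph network can then take as input.

```python
# Illustrative sketch only: turning a list of MLP weight matrices
# (each shaped out x in, biases omitted for brevity) into a graph of
# (num_nodes, edges), where each edge is (src_neuron, dst_neuron, weight).

def mlp_to_neural_graph(weight_matrices):
    # Layer widths: input width of the first matrix, then each output width.
    layer_sizes = [len(weight_matrices[0][0])]
    for W in weight_matrices:
        layer_sizes.append(len(W))

    # Assign a global node id to every neuron, layer by layer.
    offsets, total = [], 0
    for size in layer_sizes:
        offsets.append(total)
        total += size

    edges = []
    for layer, W in enumerate(weight_matrices):
        for out_idx, row in enumerate(W):
            for in_idx, w in enumerate(row):
                src = offsets[layer] + in_idx
                dst = offsets[layer + 1] + out_idx
                edges.append((src, dst, w))
    return total, edges

# Tiny 2-3-1 MLP: two weight matrices.
W1 = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]  # 3x2: hidden x input
W2 = [[1.0, -1.0, 0.5]]                    # 1x3: output x hidden
num_nodes, edges = mlp_to_neural_graph([W1, W2])
print(num_nodes, len(edges))  # 6 nodes (2+3+1), 9 edges (6+3)
```

The papers add the machinery this sketch omits: node/edge features beyond raw weights, and permutation equivariance over the neuron ordering within each layer.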
@levelsio
@levelsio
1 year
🇪🇺 eu/acc A few weeks ago Mario Draghi asked for my recommendations for his report on European competitiveness, which came out today. I had a call with him and summarized my problems with doing business in the EU. I wrote this, which is included in the report presented to the
@mnagel87
Markus Nagel
1 year
Are you pursuing a PhD and are you interested in working on efficiency of LLMs/LVMs? Then join our model efficiency team in #QualcommAIResearch for an internship! Apply below, we have openings for 2025 as well as autumn/winter 2024. https://t.co/vNrUT9wzrB
@damianborth
Damian Borth
1 year
🚀 Exciting News! 🚀 I am more than happy to share that I have officially begun my research sabbatical this week! Over the coming months, I am fortunate to be working as @TUeEAISI Visiting Professor at @TUeindhoven collaborating with @joavanschoren on some exciting ideas
@damianborth
Damian Borth
1 year
Pleasant surprise of @icmlconf: There is a growing Weight Space Learning Community out there - it was great to meet you all: Haggai Maron (@HaggaiMaron), Gal Chechik (@GalChechik), Konstantin Schürholt (@k_schuerholt), Eliahu Horwitz (@EliahuHorwitz), Derek Lim (@dereklim_lzh),
@NatashaEve4
Natasha Butt
1 year
Come see our poster #715 on CodeIt today at #ICML2024 13.30-15.00 Halle C. We approach ARC by self-improving LLMs with prioritized hindsight replay. @blazejmanczak @aukejw Corrado Rainone @davwzha @m_deff @TacoCohen
@NatashaEve4
Natasha Butt
2 years
Excited to share that our paper “CodeIt: Self-Improving Language Models with Prioritized Hindsight Replay” was accepted into ICML! @blazejmanczak @aukejw Corrado Rainone @davwzha @m_deff @TacoCohen 1/5
@_lewtun
Lewis Tunstall
1 year
Our solution write-up for the 1st AIMO Progress Prize is now out ✍️! https://t.co/sHpzAgJsRL In it, we share technical details on: ♾️💻 The 2-stage MuMathCode recipe we used to train NuminaMath 7B TIR with iterative SFT ⚖️ Evals on MATH - 56.3% for Stage 1 & 68.2% for Stage
@HaggaiMaron
Haggai Maron
2 years
Thanks, @KostasPenn, @CongyueD, and the team for inviting me. It's an honor to speak alongside this esteemed group of researchers! My talk will focus on our recent work on *Equivariant Weight Space Learning*: designing neural networks that can process other neural networks.
@KostasPenn
Kostas Daniilidis
2 years
Our Equivariant Vision workshop features five great speakers @erikjbekkers @HaggaiMaron @ninamiolane @_machc, and Leo Guibas, spotlight talks, posters, and a tutorial prepared for the vision audience. Come tomorrow, Tuesday, at 8:30am in Summit 321! Thank you @CongyueD for
@aukejw
Auke Wiggers
2 years
ARC is a tough reasoning benchmark on which modern LLMs still far underperform humans. Great to see that there's serious additional backing! Coincidentally, we just open-sourced CodeIt, our LLM self-improvement approach for ARC:
github.com: Qualcomm-AI-research/codeit
@fchollet
François Chollet
2 years
I'm partnering with @mikeknoop to launch ARC Prize: a $1,000,000 competition to create an AI that can adapt to novelty and solve simple reasoning problems. Let's get back on track towards AGI. Website: https://t.co/wNsM3IQgEI ARC Prize on @kaggle: https://t.co/Lhsh1RiWKq
@doughty_hazel
Hazel Doughty
2 years
I'm hiring for a fully funded #PhD position in #ComputerVision on 'Detailed Video Understanding' at @LIACS @UniLeiden. Apply before 22nd June. More info👇 https://t.co/QafgWY9co5
@MorningBrew
Morning Brew ☕️
2 years
Dating a model (2004) vs. dating a model (2024)
@GhadimiAtigMina
Mina Ghadimi
2 years
Don't miss our survey on "Hyperbolic Deep Learning in Computer Vision: A Survey"!🙌 #IJCV
@PascalMettes
Pascal Mettes
2 years
Our survey "Hyperbolic Deep Learning in Computer Vision: A Survey" has been accepted to #IJCV! The survey provides an organization of supervised and unsupervised hyperbolic literature. Online now: https://t.co/pHbOZKWqeq w/ @GhadimiAtigMina @mkellerressel @jeffhygu @syeung10
@NatashaEve4
Natasha Butt
2 years
Come check out our poster on CodeIt today at the #ICLR2024 #AGIWorkshop 4pm Halle A7. We tackle @fchollet’s ARC by self-improving LLMs with prioritized hindsight replay. @blazejmanczak @aukejw Corrado Rainone @davwzha @m_deff @TacoCohen
@NatashaEve4
Natasha Butt
2 years
Excited to share that our paper “CodeIt: Self-Improving Language Models with Prioritized Hindsight Replay” was accepted into ICML! @blazejmanczak @aukejw Corrado Rainone @davwzha @m_deff @TacoCohen 1/5
@AllanZhou17
Allan Zhou
2 years
ilya knows that it's time for weight-space architectures to shine :>
@davwzha
David Zhang
2 years
Join us at tomorrow's #ICLR2024 oral session in the afternoon to learn how we can turn neural networks into data! Afterwards, @MiltosKofinas, @Cyanogenoid, and I will be at poster #77: Graph Neural Networks for Learning Equivariant Representations of Neural Networks