Alessandro Ingrosso

@ai_ngrosso

Followers
727
Following
234
Media
12
Statuses
100

Theoretical neuroscience, machine learning and spin glasses. Assistant professor @Radboud_Uni.

Nijmegen, The Netherlands
Joined April 2018
@ai_ngrosso
Alessandro Ingrosso
2 months
Please RT - Open PhD position in my group at the Donders Center for Neuroscience, Radboud University. We're looking for a PhD candidate interested in developing theories of learning in neural networks. Applications are open until October 20th. Info:
ru.nl
Thank you for your interest in working at Radboud University. We are no longer taking applications for this job.
0
11
14
@ai_ngrosso
Alessandro Ingrosso
2 months
Just got back from a great summer school at Sapienza University https://t.co/2pfre6dYls where I gave a short course on Dynamics and Learning in RNNs. I compiled a (very biased) list of recommended readings on the subject, for anyone interested:
sites.google.com
The school will open the thematic period on Data Science and will be dedicated to the mathematical foundations and methods for high-dimensional data analysis. It will provide an in-depth introduction...
0
0
2
@DondersInst
Donders Institute
6 months
Check out this English-taught master's program in Neurophysics: study the brain, artificial neural networks and complex systems through the lens of mathematical modeling and physics. Radboud University / Donders (Netherlands) is one of Europe's leading universities in Neuroscience. 1/3
0
3
5
@ai_ngrosso
Alessandro Ingrosso
7 months
Our paper on the statistical mechanics of transfer learning is now published in PRL. Franz-Parisi meets Kernel Renormalization in this nice collaboration with friends in Bologna (@fgerace_) and Parma (P. Rotondo, @rosalbapacelli). https://t.co/dcWGStHMMH
journals.aps.org
Tools from spin glass theory such as the replica method help explain the efficacy of transfer learning.
2
6
22
@ai_ngrosso
Alessandro Ingrosso
9 months
Biophysics, Stat Mech and Machine Learning will meet in Trento from July 7th to 11th, 2025 in our StatPhys29 Satellite Workshop "Molecular biophysics at the transition state: from statistical mechanics to AI": https://t.co/89xU1dOpAP. Co-organized with @r_potestio lab.
0
1
5
@wredman4
Will Redman
9 months
New paper with @ai_ngrosso @VITAGroupUT @sebastiangoldt “On How Iterative Magnitude Pruning Discovers Local Receptive Fields in Fully Connected Neural Networks” accepted at the Conference on Parsimony and Learning (@CPALconf) https://t.co/T0Vb5mLhTJ 1/
arxiv.org
Since its use in the Lottery Ticket Hypothesis, iterative magnitude pruning (IMP) has become a popular method for extracting sparse subnetworks that can be trained to high performance. Despite its...
1
2
10
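
[Editor's note: for readers unfamiliar with IMP, here is a minimal sketch of the train-prune-rewind loop the tweet refers to, in PyTorch. The helper train_fn and the hyperparameters (prune_frac, rounds) are illustrative assumptions, not the paper's setup.]

import copy
import torch

def iterative_magnitude_pruning(model, train_fn, prune_frac=0.2, rounds=5):
    # Save the weights at initialization so survivors can be "rewound".
    init_state = copy.deepcopy(model.state_dict())
    masks = {name: torch.ones_like(p) for name, p in model.named_parameters()}
    for _ in range(rounds):
        train_fn(model, masks)  # hypothetical helper: trains with pruned weights held at zero
        for name, p in model.named_parameters():
            alive = p.detach().abs()[masks[name].bool()]
            if alive.numel() == 0:
                continue
            # Remove the prune_frac fraction of smallest-magnitude surviving weights.
            threshold = alive.quantile(prune_frac)
            masks[name] *= (p.detach().abs() > threshold).float()
        # Rewind: reset surviving weights to their values at initialization.
        model.load_state_dict(init_state)
        with torch.no_grad():
            for name, p in model.named_parameters():
                p *= masks[name]
    return masks

Each round removes a fixed fraction of the surviving weights, so sparsity compounds geometrically across rounds.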
@ai_ngrosso
Alessandro Ingrosso
9 months
Our paper on Wang-Landau sampling in neural networks is now published in TMLR. Here's a thread by @r_potestio.
@r_potestio
Potestio Lab
9 months
🥳 We are pleased to announce the publication of our paper “Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space” in Trans. on Machine Learning Research. https://t.co/lYukUTjpMa @MeleMargherita_ @ai_ngrosso @UniTrento @INFN_ @DondersInst
0
4
7
@SaxeLab
Andrew Saxe
11 months
New paper with @leonlufkin and @ermgrant! Why do we see localized receptive fields so often, even in models without sparsity regularization? We present a theory in the minimal setting from @ai_ngrosso and @sebastiangoldt
@leonlufkin
Leon
11 months
We’re excited to share our paper analyzing how data drives the emergence of localized receptive fields in neural networks! w/ @SaxeLab @ermgrant Come see our #NeurIPS2024 spotlight poster today at 4:30–7:30 in the East Hall! Paper: https://t.co/U2I285LLAE
0
14
86
@ermgrant
Erin Grant
11 months
If you missed it at the #NeurIPS2024 posters! Work led by @LeonLufkin on analytical dynamics of localization in simple neural nets, as seen in real+artificial nets and distilled by @ai_ngrosso @sebastiangoldt Leon is a fantastic collaborator, and is looking for PhD positions!
@leonlufkin
Leon
11 months
We’re excited to share our paper analyzing how data drives the emergence of localized receptive fields in neural networks! w/ @SaxeLab @ermgrant Come see our #NeurIPS2024 spotlight poster today at 4:30–7:30 in the East Hall! Paper: https://t.co/U2I285LLAE
0
1
22
@ai_ngrosso
Alessandro Ingrosso
1 year
This is not a tweet: @aingrosso.bsky.social
0
0
0
@ai_ngrosso
Alessandro Ingrosso
1 year
Giving a talk on Stat Mech of Transfer Learning at 3pm EST today at the Deepmath conference in Philadelphia. Here's a link for the live stream:
@deepmath1
deepmath
1 year
The yearly tradition is already here. Watch DeepMath 2024 live here:
0
3
20
@JHCornford
Jonathan
1 year
Why does #compneuro need new learning methods? ANN models are usually trained with Gradient Descent (GD), which violates biological realities like Dale’s law and log-normal weights. Here we describe a superior learning algorithm for comp neuro: Exponentiated Gradients (EG)! 1/12
@biorxiv_neursci
bioRxiv Neuroscience
1 year
Brain-like learning with exponentiated gradients https://t.co/vJYzx399ed #biorxiv_neursci
8
70
323
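
[Editor's note: to make the contrast concrete, a hedged NumPy sketch of the two update rules on a weight vector. A generic loss gradient is assumed, and this is the textbook form of exponentiated gradient for signed weights, not necessarily the paper's exact parametrization.]

import numpy as np

def gd_step(w, grad, lr=0.1):
    # Gradient descent: additive update; weights can cross zero and flip sign.
    return w - lr * grad

def eg_step(w, grad, lr=0.1):
    # Exponentiated gradient: multiplicative update; each weight is scaled
    # by a positive factor, so sign(w) is preserved (consistent with Dale's
    # law) and magnitudes spread out multiplicatively, log-normal-like.
    return w * np.exp(-lr * np.sign(w) * grad)

To first order the EG step is Δw ≈ -lr·|w|·grad, so both rules descend the loss, but EG scales each step by the weight's own magnitude: small weights stay small while large ones move fast.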
@ai_ngrosso
Alessandro Ingrosso
1 year
Our work on Wang-Landau in neural network learning (a.k.a. Wang-Learnau) is now on arXiv. We use enhanced sampling to explore how the entire density of states of a loss function is affected by data structure. A collaboration with friends in Trento powered by @MeleMargherita_.
@r_potestio
Potestio Lab
1 year
📢 PREPRINT ALERT! 📢 “Density of states in neural networks: an in-depth exploration of learning in parameter space”, by M. Mele, R. Menichetti, A. Ingrosso, R. Potestio https://t.co/RODoHe7QzT @ai_ngrosso @MeleMargherita_ @UniTrento @Radboud_Uni
0
4
18
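
[Editor's note: for context, Wang-Landau is a flat-histogram Monte Carlo method: a random walk over parameters accepts moves with probability min(1, g(E_old)/g(E_new)) while building a running estimate of the density of states g(E), refining it whenever the visit histogram flattens. Below is a toy NumPy sketch on a generic bounded loss; the bin range, step size and flatness criterion are illustrative assumptions, not the paper's scheme.]

import numpy as np

def wang_landau(loss_fn, w0, n_bins=50, loss_min=0.0, loss_max=1.0,
                step=0.05, max_iters=200_000, seed=0):
    # Toy Wang-Landau estimate of log g(E), the log-density of states of a
    # loss landscape, over n_bins bins of the loss value.
    rng = np.random.default_rng(seed)
    log_g = np.zeros(n_bins)   # running estimate of log g(E)
    hist = np.zeros(n_bins)    # visit histogram for the flatness check
    log_f = 1.0                # log modification factor, halved on flattening

    def bin_of(e):
        return int(np.clip((e - loss_min) / (loss_max - loss_min) * n_bins,
                           0, n_bins - 1))

    w, b = np.array(w0, dtype=float), bin_of(loss_fn(w0))
    for _ in range(max_iters):
        w_new = w + step * rng.standard_normal(w.shape)
        b_new = bin_of(loss_fn(w_new))
        # Accept with prob min(1, g(E_old)/g(E_new)): rare loss values are favored.
        if np.log(rng.random()) < log_g[b] - log_g[b_new]:
            w, b = w_new, b_new
        log_g[b] += log_f
        hist[b] += 1
        visited = hist[hist > 0]
        if len(visited) > 1 and visited.min() > 0.8 * visited.mean():
            hist[:] = 0        # histogram roughly flat: refine f
            log_f /= 2.0
        if log_f < 1e-4:
            break
    return log_g

Unlike sampling at a single temperature, the flat-histogram walk visits high- and low-loss regions alike, which is what lets one read off how data structure reshapes the entire density of states.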
@zdeborova
Lenka Zdeborova
1 year
This mini-review is based on Hugo Cui's PhD thesis: https://t.co/DmVhuhlBmz . My advice to him was: "Write something you would have loved to have when you started your PhD!" He did an outstanding job introducing the rich methods he developed. Enjoy and share widely!
5
66
459
@carlolucibello
Carlo Lucibello
1 year
#RockinAI #Roccella Day 4 invited speakers: Alessandro Laio, Sebastian Goldt, Pietro Rotondo, Alessandro Ingrosso
0
3
8
@ai_ngrosso
Alessandro Ingrosso
1 year
I'll be leaving ICTP in September to start as Assistant Professor at @DondersInst, @Radboud_Uni in Nijmegen. Students interested in pursuing a PhD at the border of Machine Learning, Neuroscience and Statistical Mechanics, don't hesitate to contact me.
19
20
114
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli This work would not have happened without Federica Gerace and our friends Rosalba and Pietro. Looking forward to being in Roccella Jonica https://t.co/TSdGR6NXSa to talk about our work and eat a proper amount of nduja.
sites.google.com
ROccella Conference on INference and AI - ROCKIN' AI 2024
1
0
4
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli Using a previously developed synthetic model for source-target correlations (Correlated Hidden Manifold Model), we study how the similarity between tasks affects transfer and determines whether fine-tuning is helpful.
1
0
3
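
[Editor's note: a rough illustration of the kind of generative model meant here; the function name and exact parametrization are assumptions for this sketch, and the paper's Correlated Hidden Manifold Model is defined more carefully. The idea: source and target inputs are nonlinear projections of shared low-dimensional latents, with feature maps whose overlap q tunes task similarity.]

import numpy as np

def correlated_hidden_manifold(n, d_latent, d_input, q, seed=0):
    # Source and target inputs lie on manifolds spanned by feature maps
    # F_s and F_t whose correlation q in [0, 1] controls task similarity.
    rng = np.random.default_rng(seed)
    F_s = rng.standard_normal((d_latent, d_input))
    F_t = q * F_s + np.sqrt(1.0 - q**2) * rng.standard_normal((d_latent, d_input))
    z = rng.standard_normal((n, d_latent))        # shared latent variables
    x_s = np.tanh(z @ F_s / np.sqrt(d_latent))    # source inputs
    x_t = np.tanh(z @ F_t / np.sqrt(d_latent))    # target inputs
    teacher = rng.standard_normal(d_latent)
    y = np.sign(z @ teacher / np.sqrt(d_latent))  # labels from a latent teacher
    return (x_s, y), (x_t, y)

At q = 1 the two tasks coincide and transfer is maximally useful; at q = 0 they are unrelated and fine-tuning from the source should not help.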
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli The effectiveness of Transfer Learning in fully connected networks with one hidden layer can be analyzed in a Bayesian setting, varying the interaction strength between the source and target network.
1
0
1
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli Fine-tuning features extracted from a pre-trained, data-abundant source task can boost generalization on a target task where data is scarce. Combining replicas with kernel renormalization, we built a theoretical framework to analyze Transfer Learning in the proportional limit.
1
0
3
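
[Editor's note: schematically, the Bayesian setup in this thread couples the target-network weights to a source network trained on its own data, in the spirit of a Franz-Parisi computation. The LaTeX display below is an illustrative sketch of that structure with generic notation, not the paper's exact formulas.]

% Target free energy with the source weights theta_s quenched from their
% own posterior; gamma sets the source-target interaction strength.
F_{\mathrm{t}}(\gamma) \;=\; -\frac{1}{\beta}\,
  \Big\langle \log \int d\theta_{\mathrm{t}}\,
    e^{-\beta \mathcal{L}_{\mathrm{t}}(\theta_{\mathrm{t}})
       \,+\, \gamma\, \theta_{\mathrm{t}} \cdot \theta_{\mathrm{s}}}
  \Big\rangle_{\theta_{\mathrm{s}} \,\sim\, P(\theta_{\mathrm{s}} \mid \text{source data})}

At gamma = 0 the target network learns from scratch; increasing gamma pins the target weights to the pre-trained source ones, interpolating toward the fine-tuning regime the thread describes.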