Alessandro Ingrosso

@ai_ngrosso

Followers: 713 · Following: 231 · Media: 12 · Statuses: 98

Theoretical neuroscientist and spin glass enthusiast. Assistant professor @Radboud_Uni.

Nijmegen, The Netherlands
Joined April 2018
@ai_ngrosso
Alessandro Ingrosso
2 months
RT @DondersInst: Please find this English master's program in Neurophysics. Study the brain, artificial neural networks and complex systems…
0 replies · 3 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
2 months
Our paper on the statistical mechanics of transfer learning is now published in PRL. Franz-Parisi meets Kernel Renormalization in this nice collaboration with friends in Bologna (@fgerace_) and Parma (P. Rotondo, @rosalbapacelli).
2 replies · 6 retweets · 22 likes
@ai_ngrosso
Alessandro Ingrosso
4 months
Biophysics, Stat Mech and Machine Learning will meet in Trento from July 7th to 11th, 2025, at our StatPhys29 Satellite Workshop "Molecular biophysics at the transition state: from statistical mechanics to AI", co-organized with the @r_potestio lab.
0 replies · 1 retweet · 6 likes
@ai_ngrosso
Alessandro Ingrosso
5 months
RT @wredman4: New paper with @ai_ngrosso @VITAGroupUT @sebastiangoldt “On How Iterative Magnitude Pruning Discovers Local Receptive Fiel….
0 replies · 3 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
5 months
Our paper on Wang-Landau sampling in neural networks is now published in TMLR. Here's a thread by @r_potestio.
@r_potestio
Potestio Lab
5 months
🥳 We are pleased to announce the publication of our paper “Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space” in Trans. on Machine Learning Research. @MeleMargherita_ @ai_ngrosso @UniTrento @INFN_ @DondersInst.
0 replies · 3 retweets · 6 likes
@ai_ngrosso
Alessandro Ingrosso
7 months
RT @SaxeLab: New paper with @leonlufkin and @ermgrant! Why do we see localized receptive fields so often, even in models without sparsit…
0 replies · 14 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
7 months
RT @ermgrant: If you missed it at the #NeurIPS2024 posters! Work led by @LeonLufkin on analytical dynamics of localization in simple neural….
0 replies · 1 retweet · 0 likes
@ai_ngrosso
Alessandro Ingrosso
8 months
Ceci n'est pas un tweet (this is not a tweet): @aingrosso.bsky.social
0 replies · 0 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
8 months
Giving a talk on Stat Mech of Transfer Learning at 3pm EST today at the DeepMath conference in Philadelphia. Here's a link to the live stream:
@deepmath1
deepmath
8 months
The yearly tradition is already here. Watch DeepMath 2024 live here:
0 replies · 3 retweets · 20 likes
@ai_ngrosso
Alessandro Ingrosso
9 months
RT @JHCornford: Why does #compneuro need new learning methods? ANN models are usually trained with Gradient Descent (GD), which violates bi….
0 replies · 70 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
9 months
Our work on Wang-Landau in neural network learning (a.k.a. Wang-Learnau) is now on arXiv. We use enhanced sampling to explore how the entire density of states of a loss function is affected by data structure. A collaboration with friends in Trento powered by @MeleMargherita_.
@r_potestio
Potestio Lab
9 months
📢 PREPRINT ALERT! 📢 "Density of states in neural networks: an in-depth exploration of learning in parameter space", by M. Mele, R. Menichetti, A. Ingrosso, R. Potestio. @ai_ngrosso @MeleMargherita_ @UniTrento @Radboud_Uni.
0 replies · 3 retweets · 17 likes
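For readers who want a feel for the method: below is a minimal, self-contained sketch of Wang-Landau sampling applied to a toy loss landscape. The toy data, linear model, energy window, binning, proposal scale and flatness test are all illustrative assumptions, not the construction used in the paper.

```python
# Minimal illustrative Wang-Landau sketch over the loss of a toy linear model.
# The data, energy window, binning, proposal scale and flatness test below are
# simplifying assumptions for demonstration, not the settings of the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset; the mean squared error of a linear model plays the role of energy.
X = rng.standard_normal((200, 10))
y = np.sign(X @ rng.standard_normal(10))

def energy(w):
    return np.mean((X @ w - y) ** 2)

edges = np.linspace(0.0, 5.0, 51)   # energy window and bin edges
nbins = len(edges) - 1

def bin_of(E):
    return min(nbins - 1, max(0, int(np.searchsorted(edges, E)) - 1))

log_g = np.zeros(nbins)   # running estimate of the log density of states
hist = np.zeros(nbins)    # visitation histogram used for the flatness check
log_f = 1.0               # modification factor, halved whenever hist is flat

w = 0.1 * rng.standard_normal(10)
E = energy(w)

for step in range(200_000):
    w_prop = w + 0.05 * rng.standard_normal(10)   # random-walk proposal
    E_prop = energy(w_prop)
    if E_prop < edges[-1]:                        # stay inside the energy window
        # Wang-Landau rule: accept with probability min(1, g(E)/g(E_prop)).
        if np.log(rng.random()) < log_g[bin_of(E)] - log_g[bin_of(E_prop)]:
            w, E = w_prop, E_prop
    b = bin_of(E)
    log_g[b] += log_f     # penalize the current bin so the walk keeps moving
    hist[b] += 1
    visited = hist[hist > 0]
    if len(visited) > 5 and visited.min() > 0.8 * visited.mean():
        log_f *= 0.5      # f -> sqrt(f): refine the density-of-states estimate
        hist[:] = 0

print(log_g - log_g.min())   # log density of states, up to an additive constant
```

The acceptance rule biases the random walk toward under-visited loss bins, which is what lets the estimate cover the full density of states rather than only the low-loss region a standard sampler would concentrate on.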
@ai_ngrosso
Alessandro Ingrosso
9 months
RT @zdeborova: This mini-review is based on Hugo Cui's PhD thesis. My advice to him was: "Write something you wo…
0 replies · 66 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
10 months
RT @carlolucibello: #RockinAI #Roccella Day 4 invited speakers: Alessandro Laio, Sebastian Goldt, Pietro Rotondo, Alessandro Ingrosso https….
0 replies · 3 retweets · 0 likes
@ai_ngrosso
Alessandro Ingrosso
11 months
I'll be leaving ICTP in September to start as Assistant Professor at @DondersInst, @Radboud_Uni in Nijmegen. Students interested in pursuing a PhD at the interface of Machine Learning, Neuroscience and Statistical Mechanics, don't hesitate to contact me.
19 replies · 20 retweets · 113 likes
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli This work would not have happened without Federica Gerace and our friends Rosalba and Pietro. Looking forward to being in Roccella Jonica to talk about our work and eat a proper amount of 'nduja.
1 reply · 0 retweets · 4 likes
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli Using a previously developed synthetic model for source-target correlations (Correlated Hidden Manifold Model), we study how the similarity between tasks affects transfer and determines whether fine-tuning is helpful.
1 reply · 0 retweets · 3 likes
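As a rough illustration of the kind of generative model involved (the precise definition is in the paper), correlated source and target tasks can be sampled by sharing a low-dimensional latent representation and letting a correlation parameter interpolate between source and target feature maps and teachers. Everything below is an assumption-laden sketch, not the paper's exact Correlated Hidden Manifold Model.

```python
# Hypothetical sketch of correlated source/target data in the spirit of the
# Correlated Hidden Manifold Model; the feature maps, teachers and the way the
# correlation parameter rho enters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
D, N, P = 20, 100, 500   # latent dimension, input dimension, number of samples
rho = 0.7                # source-target similarity: rho = 1 means identical tasks

# Shared low-dimensional latent vectors (the "hidden manifold").
C = rng.standard_normal((P, D))

# Source feature map and latent-space teacher.
F_src = rng.standard_normal((D, N))
w_src = rng.standard_normal(D)

# Target feature map and teacher: rho-correlated copies of the source ones.
F_tgt = rho * F_src + np.sqrt(1 - rho**2) * rng.standard_normal((D, N))
w_tgt = rho * w_src + np.sqrt(1 - rho**2) * rng.standard_normal(D)

# Inputs are N-dimensional but depend only on the D-dimensional latents.
X_src = np.tanh(C @ F_src / np.sqrt(D))
X_tgt = np.tanh(C @ F_tgt / np.sqrt(D))

# Labels are produced by the latent-space teachers.
y_src = np.sign(C @ w_src)
y_tgt = np.sign(C @ w_tgt)
```

Sweeping rho between 0 and 1 then provides a single knob for task similarity whose effect on transfer performance can be measured.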
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli The effectiveness of Transfer Learning in fully connected networks with one hidden layer can be analyzed in a Bayesian setting, varying the interaction strength between the source and target network.
1 reply · 0 retweets · 1 like
@ai_ngrosso
Alessandro Ingrosso
1 year
@fgerace_ @RosalbaPacelli Fine-tuning features extracted from a pre-trained, data-abundant source task can boost generalization on a target task where data is scarce. Combining replicas with kernel renormalization, we built a theoretical framework to analyze Transfer Learning in the proportional limit.
1 reply · 0 retweets · 3 likes
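A purely illustrative version of that protocol, under assumptions chosen for the demo (plain gradient descent rather than the Bayesian setting analyzed in the paper, arbitrary sizes and teachers): pre-train a one-hidden-layer network on an abundant source task, fine-tune it on a scarce correlated target task, and compare with training from scratch.

```python
# Illustrative transfer-learning protocol: pre-train on a data-abundant source
# task, then fine-tune on a data-scarce, correlated target task. All sizes,
# teachers and the gradient-descent training loop are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(1)
N, K = 50, 100   # input dimension, hidden units

def make_task(teacher, n):
    X = rng.standard_normal((n, N))
    return X, np.sign(X @ teacher)

def train(X, y, W=None, a=None, epochs=500, lr=0.05):
    # Full-batch gradient descent on the squared loss of a 1-hidden-layer net.
    if W is None:
        W = rng.standard_normal((N, K)) / np.sqrt(N)
    if a is None:
        a = rng.standard_normal(K) / np.sqrt(K)
    for _ in range(epochs):
        H = np.tanh(X @ W)                      # hidden activations
        err = H @ a - y                         # residual of the readout
        grad_W = X.T @ ((err[:, None] * a[None, :]) * (1 - H**2)) / len(y)
        grad_a = H.T @ err / len(y)
        W -= lr * grad_W
        a -= lr * grad_a
    return W, a

def accuracy(X, y, W, a):
    return np.mean(np.sign(np.tanh(X @ W) @ a) == y)

# Correlated source and target teachers; rho controls task similarity.
rho = 0.8
w_src = rng.standard_normal(N)
w_tgt = rho * w_src + np.sqrt(1 - rho**2) * rng.standard_normal(N)

X_src, y_src = make_task(w_src, 2000)   # data-abundant source task
X_tgt, y_tgt = make_task(w_tgt, 40)     # data-scarce target task
X_tst, y_tst = make_task(w_tgt, 2000)   # held-out target test set

W0, a0 = train(X_src, y_src)                                 # pre-train on source
W_ft, a_ft = train(X_tgt, y_tgt, W=W0.copy(), a=a0.copy())   # fine-tune on target
W_sc, a_sc = train(X_tgt, y_tgt)                             # target-only baseline

print("fine-tuned  :", accuracy(X_tst, y_tst, W_ft, a_ft))
print("from scratch:", accuracy(X_tst, y_tst, W_sc, a_sc))
```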
@ai_ngrosso
Alessandro Ingrosso
1 year
Not exactly thrilled but at least not unhappy to introduce single-instance Franz-Parisi, a new tool to investigate Transfer Learning in neural networks. Great collaboration with @fgerace_, @RosalbaPacelli and Pietro Rotondo.
4 replies · 18 retweets · 88 likes
@ai_ngrosso
Alessandro Ingrosso
1 year
RT @LLogiaco: Back from this workshop, wonderfully organized by F. Mastrogiuseppe, @APalmigiano, @ai_ngrosso & @sebastiangoldt - thank you! L…
0 replies · 3 retweets · 0 likes