
Gennaro Gala
@gengala13
Followers: 119 · Following: 786 · Media: 18 · Statuses: 55
AI researcher https://t.co/1y8D8lEshc
Joined January 2020
RT @drmichaellevin: I'm constantly irritated that I don't have time to read the torrent of cool papers coming faster and faster from amazin…
Replies: 0 · Reposts: 307 · Likes: 0
RT @gabriberton: I see a paper that looks interesting. I download the paper. Open the link to the code. "Code coming soon" one year ago. De…
Replies: 0 · Reposts: 19 · Likes: 0
RT @TacoCohen: Nobody wants to hear it, but working on data is more impactful than working on methods or architectures.
Replies: 0 · Reposts: 99 · Likes: 0
RT @EmilevanKrieken: We propose Neurosymbolic Diffusion Models! We find diffusion is especially compelling for neurosymbolic approaches, co…
Replies: 0 · Reposts: 104 · Likes: 0
RT @kchonyc: feeling a bit under the weather this week … thus an increased level of activity on social media and blog:
Replies: 0 · Reposts: 115 · Likes: 0
RT @alfcnz: Scaling continuous latent variable models as probabilistic integral circuits @tetraduzione @gengala13
Replies: 0 · Reposts: 10 · Likes: 0
RT @tetraduzione: Less than two weeks to submit your papers on: 📈 #lowrank adapters and #factorizations 🧊 #tensor networks 🔌 probabilistic…
april-tools.github.io — The AAAI Workshop on Connecting Low-Rank Representations in AI
Replies: 0 · Reposts: 15 · Likes: 0
RT @tetraduzione: factorizing a tensor with infinite dimensions⁉️ yes that's possible‼️ and this gives you tractable inference, and the si…
github.com — gengala/ten-pics
Replies: 0 · Reposts: 6 · Likes: 0
This is joint work with amazing collaborators @cassiopc, @tetraduzione, and @equaeghe. Check out the paper 📜 (preprint) 📜 (NeurIPS) and code 🖥️
github.com — gengala/ten-pics
Replies: 0 · Reposts: 1 · Likes: 8
In previous work, to keep training feasible, we only built tree-shaped PICs with simple univariate dependencies among latents. How can we build more intricate structures and allow for multivariate latent relationships while providing scalable training?
Applying this process hierarchically to a PIC leads to a QPC, i.e. a PC encoding a numerical quadrature process. Below, we illustrate the approximation of a tree-shaped PIC as a QPC, using N=3 integration points.
Replies: 1 · Reposts: 0 · Likes: 6
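A minimal sketch of the quadrature idea in the thread above, assuming a 1-D Gaussian prior and emission (my choices for illustration, not the paper's code): an N-point quadrature turns the continuous latent integral into a finite mixture, i.e. a PC sum unit.

# Approximate p(x) = ∫ p(x|z) p(z) dz with N quadrature points,
# turning the continuous latent-variable model into a finite mixture.
import numpy as np
from scipy.stats import norm

N = 3  # integration points, as in the illustration above

# Gauss-Legendre nodes/weights on [-1, 1], rescaled to a latent range [a, b]
a, b = -4.0, 4.0
nodes, weights = np.polynomial.legendre.leggauss(N)
z = 0.5 * (b - a) * nodes + 0.5 * (b + a)
w = 0.5 * (b - a) * weights

def p_x_given_z(x, z_i):
    # hypothetical emission: x | z ~ Normal(z, 1)
    return norm.pdf(x, loc=z_i, scale=1.0)

def p_z(z_i):
    # hypothetical prior over the continuous latent: z ~ Normal(0, 1)
    return norm.pdf(z_i)

def p_x_quadrature(x):
    # p(x) ≈ Σ_i w_i · p(x|z_i) · p(z_i): an N-component mixture,
    # i.e. one sum unit of the resulting QPC
    return sum(w_i * p_x_given_z(x, z_i) * p_z(z_i) for w_i, z_i in zip(w, z))

print(p_x_quadrature(0.0))  # quadrature estimate of the marginal density

Applying this replacement at every continuous latent in the hierarchy is what yields the QPC the tweet describes.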
I’ll be attending #NeurIPS2024, where I’ll present our spotlight: Scaling Continuous Latent Variable Models as Probabilistic Integral Circuits (PICs). TL;DR: We learn continuous hierarchical mixtures as DAG-shaped PICs, and scale them using neural functional sharing techniques.
Replies: 2 · Reposts: 21 · Likes: 169
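A speculative PyTorch sketch of what "neural functional sharing" could look like (all names and shapes are illustrative assumptions, not the paper's API): one shared network emits the mixture weights for every sum unit, so the parameter count no longer grows with circuit size.

# Toy illustration: a single shared MLP maps each unit's integration
# points to that unit's normalized mixture weights. Names are hypothetical.
import torch
import torch.nn as nn

class SharedWeightNet(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # one small MLP shared by every sum unit in the circuit
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, z):
        # z: (num_units, N, 1) integration points; returns per-unit
        # normalized mixture weights of shape (num_units, N)
        logits = self.net(z).squeeze(-1)
        return torch.softmax(logits, dim=-1)

shared = SharedWeightNet()
z = torch.linspace(-4, 4, 3).view(1, 3, 1).repeat(128, 1, 1)  # 128 units, N=3
weights = shared(z)  # parameters for all 128 units from one network
print(weights.shape)  # torch.Size([128, 3])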
Lorenzo & team are really pushing the boundaries of tractable models; check out this banger!
We learn more expressive mixture models that can subtract probability density by squaring them. 🚨 We show squaring can reduce expressiveness. To tackle this we build sum of squares circuits 🆘 🚀 We explain why complex parameters help, and show an expressiveness hierarchy around 🆘
Replies: 0 · Reposts: 2 · Likes: 12
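A toy numeric check of the squaring idea in the quoted tweet (my sketch; actual sum-of-squares circuits impose circuit structure not reproduced here): a linear combination with a negative weight is not a density, but its square is nonnegative and can be normalized, which is how squaring lets a model subtract probability mass.

# A mixture with a negative weight goes below zero; its square does not.
import numpy as np
from scipy.stats import norm

xs = np.linspace(-6, 6, 2001)
f1 = norm.pdf(xs, 0.0, 2.0)   # broad component
f2 = norm.pdf(xs, 0.0, 0.5)   # narrow component to subtract

g = f1 - 0.4 * f2             # negative weight: not a valid density
q = g ** 2                    # squared: nonnegative everywhere
Z = q.sum() * (xs[1] - xs[0]) # numeric normalizer (closed-form for PCs)
p = q / Z                     # a valid density with a "dip" carved at 0

print(g.min() < 0, p.min() >= 0)  # True True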