Gennaro Gala Profile
Gennaro Gala

@gengala13

Followers: 119
Following: 786
Media: 18
Statuses: 55

AI researcher https://t.co/1y8D8lEshc

Joined January 2020
@gengala13
Gennaro Gala
10 days
RT @drmichaellevin: I'm constantly irritated that I don't have time to read the torrent of cool papers coming faster and faster from amazin…
0 replies · 307 retweets · 0 likes
@gengala13
Gennaro Gala
14 days
RT @PMinervini: "in 2025 we will have flying cars" 😂😂😂
[image]
0 replies · 1K retweets · 0 likes
@gengala13
Gennaro Gala
2 months
RT @gabriberton: I see a paper that looks interesting. I download the paper. Open the link to the code. "Code coming soon" one year ago. De…
0 replies · 19 retweets · 0 likes
@gengala13
Gennaro Gala
2 months
RT @TacoCohen: Nobody wants to hear it, but working on data is more impactful than working on methods or architectures.
0 replies · 99 retweets · 0 likes
@gengala13
Gennaro Gala
2 months
RT @EmilevanKrieken: We propose Neurosymbolic Diffusion Models! We find diffusion is especially compelling for neurosymbolic approaches, co…
0 replies · 104 retweets · 0 likes
@gengala13
Gennaro Gala
7 months
RT @kchonyc: feeling a bit under the weather this week … thus an increased level of activity on social media and blog:
0 replies · 115 retweets · 0 likes
@gengala13
Gennaro Gala
7 months
RT @alfcnz: Scaling continuous latent variable models as probabilistic integral circuits @tetraduzione @gengala13
[image]
0 replies · 10 retweets · 0 likes
@gengala13
Gennaro Gala
8 months
RT @tetraduzione: Less than two weeks to submit your papers on:
📈 #lowrank adapters and #factorizations
🧊 #tensor networks
🔌 probabilistic…
april-tools.github.io
The AAAI Workshop on Connecting Low-Rank Representations in AI
0 replies · 15 retweets · 0 likes
@gengala13
Gennaro Gala
9 months
RT @tetraduzione: factorizing a tensor with infinite dimensions⁉️ yes that's possible‼️ and this gives you tractable inference, and the si…
[image]
github.com/gengala/ten-pics
0 replies · 6 retweets · 0 likes
@gengala13
Gennaro Gala
9 months
This is joint work with amazing collaborators @cassiopc, @tetraduzione, and @equaeghe. Check out the paper 📜 (preprint), 📜 (NeurIPS), and code 🖥️
[image]
github.com/gengala/ten-pics
0 replies · 1 retweet · 8 likes
@gengala13
Gennaro Gala
9 months
In extensive distribution estimation benchmarks, QPCs materialized from PICs with functional sharing systematically outperform standard PCs commonly learned via EM or Adam/SGD.
[image]
1 reply · 0 retweets · 5 likes
@gengala13
Gennaro Gala
9 months
In practice, PICs with functional sharing (green), unlike those without (orange), need the same resources as PCs (blue) and use up to 99% fewer parameters! We plot the parameter count for PCs (K is the layer size) and for PICs (M is the MLP size): PICs with sharing stay around 5M parameters, while the others reach 2B!
[image]
1 reply · 0 retweets · 5 likes
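For intuition on why sharing changes the count so dramatically, here is a back-of-envelope sketch. Every size below is hypothetical, chosen only to reproduce the order of magnitude in the tweet; the paper's exact counts come from its own architectures.

```python
# Back-of-envelope illustration of the parameter savings claimed above.
# All sizes are hypothetical (assumed for illustration).

n_units    = 50_000   # integral units in a large DAG-shaped PIC (assumed)
mlp_params = 40_000   # parameters of one small MLP (assumed)
n_depths   = 10       # depths, each sharing one multi-headed trunk (assumed)
hidden     = 64       # trunk width; each unit adds one head row (assumed)

no_sharing = n_units * mlp_params                      # one private MLP per unit
sharing    = n_depths * mlp_params + n_units * hidden  # shared trunks + heads

print(f"without sharing: {no_sharing:,}")                   # 2,000,000,000
print(f"with sharing:    {sharing:,}")                      # 3,600,000
print(f"reduction:       {1 - sharing / no_sharing:.1%}")   # 99.8%
```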
@gengala13
Gennaro Gala
9 months
Materializing QPCs is expensive when function evaluation is costly, so we propose neural functional sharing: we parameterize all integral units at the same depth with a multi-headed MLP. We show this for 4 bivariate functions, which we discretize into what we call a folded CP layer.
[image]
1 reply · 0 retweets · 6 likes
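A minimal sketch of what such a shared multi-headed MLP could look like. The architecture (trunk depth, tanh activations, softplus for positivity) is an assumption for illustration, not the authors' exact model.

```python
import torch
import torch.nn as nn

class SharedBivariateFunctions(nn.Module):
    """One shared trunk for all integral units at a given depth;
    each bivariate function f1..f4 is one output head (assumed design)."""

    def __init__(self, n_heads: int = 4, hidden: int = 64):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        self.heads = nn.Linear(hidden, n_heads)  # one head per function

    def forward(self, z1: torch.Tensor, z2: torch.Tensor) -> torch.Tensor:
        # evaluate all heads at once on a grid of latent points
        zz = torch.stack(torch.broadcast_tensors(z1, z2), dim=-1)
        h = self.trunk(zz)
        return nn.functional.softplus(self.heads(h))  # keep outputs positive

# Discretizing each head on an N-point grid yields 4 matrices, which can
# be stacked into one folded CP layer.
N = 8
z = torch.linspace(-1.0, 1.0, N)
f = SharedBivariateFunctions()
grid = f(z[:, None], z[None, :])   # shape (N, N, 4): f1..f4 on the grid
print(grid.shape)
```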
@gengala13
Gennaro Gala
9 months
Zooming in on the QPC materialization, we show how the function f4 above can be discretized via numerical quadrature and used to parameterize a Tucker layer. The two Gaussian blocks are just vectors, which are combined via an outer product that is then matrix-multiplied by \tilde{W}.
[image]
1 reply · 0 retweets · 6 likes
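A hedged sketch of this materialization step; the stand-in f4, the quadrature rule, and the Gaussian leaves are all illustrative assumptions, not the paper's code.

```python
import numpy as np

N = 8                                        # quadrature points per latent
z, w = np.polynomial.legendre.leggauss(N)    # Gauss-Legendre nodes/weights

def f4(z1, z2):
    # stand-in for the bivariate positive function f4 in the figure
    return np.exp(-(z1**2 + z2**2))

# Discretized kernel, folding the quadrature weights into function values:
# W~[i, j] ≈ w_i * w_j * f4(z_i, z_j)
W_tilde = np.outer(w, w) * f4(z[:, None], z[None, :])

# The two "Gaussian blocks" are vectors: each child's density evaluated
# at the quadrature nodes.
a = np.exp(-0.5 * (0.3 - z) ** 2)            # left child outputs, shape (N,)
b = np.exp(-0.5 * (1.1 - z) ** 2)            # right child outputs, shape (N,)

# Tucker-layer computation: outer product of the children, contracted
# against the flattened kernel. A layer with K output units would use a
# (K, N*N) matrix here instead of (1, N*N).
out = W_tilde.reshape(1, -1) @ np.outer(a, b).reshape(-1)
print(out)   # approximates the double integral of f4 weighted by the leaves
```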
@gengala13
Gennaro Gala
9 months
We propose a pipeline that starts from arbitrary variable decompositions (1), builds DAG-shaped PICs (2), trains them by materializing them as tensorized circuits (aka tensor networks) called Quadrature PCs (QPCs) (3), and folds them for fast inference (4).
[image]
1 reply · 0 retweets · 7 likes
@gengala13
Gennaro Gala
9 months
In previous work, to keep training feasible, we only built tree-shaped PICs with simple univariate dependencies among latents. How can we build more intricate structures and allow for multivariate latent relationships while keeping training scalable?
@gengala13
Gennaro Gala
1 year
Applying this process hierarchically to a PIC leads to a QPC, i.e. a PC encoding a numerical quadrature process. Below, we illustrate the approximation of a tree-shaped PIC as a QPC, using N=3 integration points.
[image]
1 reply · 0 retweets · 6 likes
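A minimal numerical sketch of this quadrature step: a continuous (integral-unit) mixture becomes a finite, PC-style mixture with N=3 integration points. The weight function f, the quadrature rule, and the Gaussian leaves are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.stats import norm

N = 3
z, w = np.polynomial.legendre.leggauss(N)    # nodes/weights on [-1, 1]

def f(z):
    # illustrative positive weight function over the latent z
    return np.exp(-z**2)

def p_continuous_approx(x):
    # \int f(z) N(x | z, 1) dz  ≈  sum_i w_i f(z_i) N(x | z_i, 1),
    # normalized so the result is a proper finite-mixture density
    weights = w * f(z)                              # finite mixture weights
    comps = norm.pdf(x[:, None], loc=z[None, :])    # (len(x), N) leaf densities
    return comps @ (weights / weights.sum())

x = np.linspace(-3.0, 3.0, 7)
print(p_continuous_approx(x))                       # 7 density values
```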
@gengala13
Gennaro Gala
9 months
Background: PICs are symbolic computational graphs over positive functions that make it possible to build and represent continuous hierarchical mixtures. They extend probabilistic circuits (PCs) with a new type of computational unit: the integral unit.
[image]
1 reply · 1 retweet · 8 likes
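A rough sketch of the contrast between the two unit types, in illustrative notation (assumed, not necessarily the paper's):

```latex
% A PC sum unit computes a finite mixture of its input units:
s(x) = \sum_{k=1}^{K} w_k \, c_k(x), \qquad w_k \ge 0
% A PIC integral unit swaps the finite weight vector for a positive
% function f over a continuous latent z, giving a continuous mixture:
u(x) = \int_{\mathcal{Z}} f(z) \, c(x \mid z) \, \mathrm{d}z
```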
@gengala13
Gennaro Gala
9 months
I’ll be attending #NeurIPS2024, where I’ll present our spotlight: Scaling Continuous Latent Variable Models as Probabilistic Integral Circuits (PICs). TL;DR: We learn continuous hierarchical mixtures as DAG-shaped PICs, and scale them using neural functional sharing techniques.
[image]
2 replies · 21 retweets · 169 likes
@gengala13
Gennaro Gala
10 months
Lorenzo & team are really pushing the boundaries of tractable models. Check out this banger!
@loreloc_
Lorenzo Loconte
10 months
We learn more expressive mixture models that can subtract probability density by squaring them. 🚨 We show squaring can reduce expressiveness. To tackle this we build sum-of-squares circuits 🆘. 🚀 We explain why complex parameters help, and show an expressiveness hierarchy around 🆘
0 replies · 2 retweets · 12 likes