
Lorenzo Loconte
@loreloc_
Followers: 531 · Following: 4K · Media: 29 · Statuses: 152
PhD Student @ University of Edinburgh
Edinburgh, Scotland
Joined March 2017
We learn more expressive mixture models that can subtract probability density by squaring them. We show squaring can reduce expressiveness. To tackle this we build sum of squares circuits. We explain why complex parameters help, and show an expressiveness hierarchy around them.
3 replies · 33 reposts · 165 likes
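A worked sketch of the squaring idea from the thread above, using an illustrative two-component mixture of our own choosing (the notation is not taken from the paper):

$$ c(x) = \big(w_1\,p_1(x) + w_2\,p_2(x)\big)^2 = w_1^2\,p_1(x)^2 + 2\,w_1 w_2\,p_1(x)\,p_2(x) + w_2^2\,p_2(x)^2 \ge 0. $$

With w_2 < 0 the cross term subtracts density where p_1 and p_2 overlap, yet c(x) stays non-negative everywhere because it is a square; renormalizing only requires integrating the pairwise products p_i(x) p_j(x), which is available in closed form for, e.g., Gaussian components.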
RT @EurIPSConf: EurIPS is coming! Mark your calendar for Dec. 2-7, 2025 in Copenhagen. EurIPS is a community-organized conference where…
0 replies · 26 reposts · 0 likes
RT @c_gregucci: Spotlight poster coming soon at #ICML2025 @icmlconf! East Exhibition Hall A-B E-1806. Wed 16 Jul 4:30 p.m. PDT – 7 p.m.…
0 replies · 6 reposts · 0 likes
RT @ema_marconato: Why are linear properties so ubiquitous in LLM representations? We explore this question through the lens of identifia…
0 replies · 61 reposts · 0 likes
RT @EmilevanKrieken: We propose Neurosymbolic Diffusion Models! We find diffusion is especially compelling for neurosymbolic approaches, co…
0 replies · 105 reposts · 0 likes
RT @LennertDS_: Just under 10 days left to submit your latest endeavours in #tractable probabilistic models! Join us at TPM @auai.org #U…
0 replies · 4 reposts · 0 likes
RT @diegocalanzone: In LoCo-LMs, we propose a neuro-symbolic loss function to fine-tune a LM to acquire logically consistent knowledge fro…
arxiv.org
Large language models (LLMs) are a promising venue for natural language understanding and generation. However, current LLMs are far from reliable: they are prone to generating non-factual...
0 replies · 4 reposts · 0 likes
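As a hedged illustration of the kind of loss the LoCo-LMs tweet above describes: the snippet below is a semantic-loss-style sketch under our own assumptions (function name, rule, and exact form are ours, not LoCo-LMs' actual objective).

    import torch

    def implication_loss(p_a: torch.Tensor, p_b: torch.Tensor) -> torch.Tensor:
        # Probability mass the model assigns to a violation of the rule
        # A -> B (i.e., A true while B false); minimizing it pushes the LM
        # toward logically consistent beliefs on the pair of facts (A, B).
        p_violation = p_a * (1.0 - p_b)
        return -torch.log(1.0 - p_violation + 1e-9).mean()

    # p_a, p_b would come from the LM's probabilities for two logically
    # related statements (hypothetical values here).
    p_a = torch.tensor([0.9, 0.2])
    p_b = torch.tensor([0.1, 0.8])
    print(implication_loss(p_a, p_b).item())  # large when A is likely but B is not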
RT @e_giunchiglia: New at #ICLR: we introduce the first ever layer that makes any neural network compliant by design with constraints expr…
0 replies · 8 reposts · 0 likes
RT @PMinervini: @chaitjo @benfinkelshtein @ffabffrasca @mmbronstein @phanein @michael_galkin @chrsmrrs hi, we found problematic benchmarks…
arxiv.org
Complex query answering (CQA) on knowledge graphs (KGs) is gaining momentum as a challenging reasoning task. In this paper, we show that the current benchmarks for CQA might not be as complex as...
0 replies · 5 reposts · 0 likes
You can find the speakers' bios and the abstracts of the presentations here. Check them out!
april-tools.github.io
The AAAI Workshop on Connecting Low-Rank Representations in AI
0 replies · 0 reposts · 1 like
After the lunch break, Andrew G. Wilson (@andrewgwils) is now giving his presentation on the importance of linear algebra structures in ML, and on how to navigate such structures in practice.
1 reply · 0 reposts · 3 likes
Live from the CoLoRAI workshop at AAAI: @nadavcohen is now giving his talk on "What Makes Data Suitable for Deep Learning?". Tools from quantum physics are shown to be useful for building more expressive deep learning models by changing the data distribution.
1 reply · 1 repost · 6 likes
RT @benjiewang_cs: Circuits use sum-product computation graphs to model probability densities. But how do we ensure the non-negativity of t…
0 replies · 1 repost · 0 likes
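One standard answer to the non-negativity question in the thread above is squaring. Here is a minimal numeric sketch with an illustrative two-Gaussian mixture (our own example, not the thread's code):

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    w = np.array([1.0, -0.6])                  # a negative weight is allowed
    comps = [norm(-1.0, 1.0), norm(1.0, 1.0)]  # illustrative components

    def f(x):
        # The linear combination itself can go negative somewhere...
        return sum(wi * c.pdf(x) for wi, c in zip(w, comps))

    Z, _ = quad(lambda x: f(x) ** 2, -np.inf, np.inf)  # normalizer of the square

    def density(x):
        # ...but its square is non-negative, and dividing by Z makes it a density.
        return f(x) ** 2 / Z

    xs = np.linspace(-5.0, 5.0, 1001)
    assert (density(xs) >= 0.0).all()
    print(round(quad(density, -np.inf, np.inf)[0], 4))  # ~1.0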
We are going to present our poster "Sum of Squares Circuits" at AAAI in Philadelphia today: Hall E, 12:30pm–2:00pm, poster #840. We trace expressiveness connections between different types of additive and subtractive deep mixture models and tensor networks.
arxiv.org
Designing expressive generative models that support exact and efficient inference is a core question in probabilistic ML. Probabilistic circuits (PCs) offer a framework where this...
We learn more expressive mixture models that can subtract probability density by squaring them. We show squaring can reduce expressiveness. To tackle this we build sum of squares circuits. We explain why complex parameters help, and show an expressiveness hierarchy around them.
0 replies · 6 reposts · 25 likes
RT @GabVenturato: Can AI reason over time while following logical rules in relational domains? We will present Relational Neurosymbolic M…
0 replies · 4 reposts · 0 likes