
Benjie Wang
@benjiewang_cs
Followers: 96 · Following: 185 · Media: 9 · Statuses: 27
Postdoc at UCLA StarAI Lab @UCLAComSci
Joined November 2019
Also check out the awesome paper "Sum of Squares Circuits" by @loreloc_, Stefan Mengel, and @tetraduzione, which concurrently showed the separation between monotone and squared circuits. Also at AAAI 2025 today, poster #840!
Circuits use sum-product computation graphs to model probability densities. But how do we ensure the non-negativity of the output? Check out our poster "On the Relationship between Monotone and Squared Probabilistic Circuits" at AAAI 2025 **today**, 12:30pm-14:30pm, poster #841.
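A minimal toy sketch (illustrative only, not code from the paper) of the two standard ways to guarantee a non-negative output from a sum-product circuit: restrict all parameters to be non-negative (monotone circuits), or allow negative parameters and square the circuit's output (squared circuits).

```python
import math

def monotone_circuit(x1, x2):
    # Monotone circuit: the leaf functions (here, Gaussian densities) are
    # non-negative and all sum weights are non-negative, so every
    # intermediate value -- and hence the output -- is >= 0.
    def gauss(x, mu):
        return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    # Weighted sum of two product units over x1 and x2.
    return 0.3 * gauss(x1, 0.0) * gauss(x2, 0.0) \
         + 0.7 * gauss(x1, 1.0) * gauss(x2, -1.0)

def squared_circuit(x1, x2):
    # Squared circuit: the inner sum-product expression may use negative
    # weights (so it can represent signed functions), but squaring the
    # output guarantees non-negativity.
    inner = 0.5 * x1 * x2 - 0.2 * x1 + 0.1  # real-valued inner circuit
    return inner ** 2
```

Both constructions yield an unnormalized non-negative function; the paper's question is how the two model classes relate in expressive efficiency.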
RT @danielmisrael: “That’s one small [MASK] for [MASK], a giant [MASK] for mankind.” – [MASK] Armstrong. Can autoregressive models predict…
Thanks to my amazing co-authors Denis Mauá, @guyvdb, and YooJung Choi. Hope to see you at the poster session!
You have some model/knowledge (e.g. Bayes net, probabilistic/logic program, DB) and some query (e.g. MAP, causal adjustment) you want to ask. When can you compute this efficiently? Find out @ NeurIPS today in Poster Session 6 East, #3801. Paper:
RT @HonghuaZhang2: So excited to present Ctrl-G **Adaptable Logical Control for Large Language Models** TODAY at #NeurIPS2024 West Ballroom…
RT @zhezeng0908: 📢 I’m recruiting PhD students @CS_UVA for Fall 2025! 🎯 Neurosymbolic AI, probabilistic ML, trustworthiness, AI for science…
RT @e_giunchiglia: 🚨 Exciting Opportunity! 🚨 I’m looking for PhD students to join my team @ImperialEEE and @ImperialX_AI! 🌍🔍 Research Top…
Excited to share our work on LLM tokenization, led by the awesome @renatogeh. We find significant boosts in downstream performance by probabilistically interpreting the space of tokenizations of a text. A bit of probabilistic reasoning goes a long way!
Where is the signal in LLM tokenization space? Does it only come from the canonical (default) tokenization? The answer is no! By looking at other ways to tokenize the same text, we get a consistent boost to LLM performance! 1/5
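A hypothetical sketch of the core idea (the function names and probabilities below are illustrative, not from the paper): rather than scoring only the canonical tokenization of a string, aggregate the model's probability over several valid tokenizations of the same text.

```python
def token_probability(tokens):
    # Stand-in for an LLM's probability of a token sequence; a real
    # implementation would query the model's log-probabilities instead
    # of this hard-coded lookup table.
    table = {
        ("un", "believ", "able"): 0.05,   # canonical tokenization
        ("un", "be", "liev", "able"): 0.02,
        ("unbeliev", "able"): 0.01,
    }
    return table.get(tuple(tokens), 0.0)

def marginal_probability(tokenizations):
    # P(text) = sum over tokenizations t of P(t): the total probability
    # the model assigns to the string, not just to its canonical
    # segmentation.
    return sum(token_probability(t) for t in tokenizations)

tokenizations = [
    ["un", "believ", "able"],
    ["un", "be", "liev", "able"],
    ["unbeliev", "able"],
]
print(marginal_probability(tokenizations))
```

Here the non-canonical tokenizations contribute over a third of the total mass, which is the kind of "signal" the thread refers to.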
Super cool work on discretizing probability distributions with *exponential* gains in succinctness! Recommended reading for probabilistic inference folks.
Are you looking for an inference algorithm that supports your discrete-continuous probabilistic program? Look no further! We have developed a new probabilistic programming language (PPL) called HyBit that provides scalable support for discrete-continuous probabilistic programs.
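A toy illustration of the discretization idea (this is not HyBit's actual API, just a sketch under the assumption that continuous variables are "bit blasted" into binary digits): truncating Uniform[0, 1) to n binary digits is exactly n independent fair coin flips, so n bits encode 2^n discretization intervals, which is where the exponential succinctness gain comes from.

```python
import itertools

def uniform_bits(n):
    # Distribution over the n-bit binary expansion of Uniform[0, 1):
    # each bit is an independent Bernoulli(0.5), so every one of the
    # 2^n intervals has probability 0.5^n.
    for bits in itertools.product([0, 1], repeat=n):
        value = sum(b * 2.0 ** -(i + 1) for i, b in enumerate(bits))
        yield value, 0.5 ** n  # interval's left endpoint, probability

def prob_less_than(n, threshold):
    # P(X < threshold) under the n-bit discretization.
    return sum(p for v, p in uniform_bits(n) if v < threshold)

print(prob_less_than(8, 0.25))
```

In a real bit-blasting PPL the bits would be symbolic Boolean variables handled by a discrete inference backend, not enumerated explicitly as here.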
RT @ZhijingJin: We will organize a "Causality for LLMs" Tutorial #NeurIPS2024 @NeurIPSConf. Happy to contribute to our community an intro o…