syz
@stephenz_y
Followers
555
Following
12K
Media
66
Statuses
767
grad student burner acc
somewhere in paris
Joined February 2018
12 terms you still hear in academia but nowhere else: sparse coding, basis pursuit, compressed sensing, wavelets, dictionary learning, data science, big data, data mining, sensor networks, formal methods, internet of things, cyber-physical systems
13
5
165
Okay I have a theory as to what kind of person gets one-shotted by GenAI into psychosis. It’s the people who never learned to do the busy work. The “idea people.” People who do the busy work know how the details work. Of course because that work sucks, they also ask AI to do
So many people getting one-shotted by AI is funny. GenAI does what you already know how to do: the busy work. You still need to think and come up with original ideas, which was already hard to do before GenAI and is probably harder now that most people can't think much at
62
130
2K
Finally, this paper has been published in IEEE Transactions on Information Theory. Very proud of Cale (now at Monash University) and Amanjit, who made this happen. We are honoured to dedicate this work to Prof. Amari. https://t.co/aZBwq9J26f
ieeexplore.ieee.org
The Bregman-Wasserstein divergence is the optimal transport cost when the underlying cost function is given by a Bregman divergence, and arises naturally in fields such as statistics and machine...
New version on https://t.co/ZAEbHWNSm8. Couldn't have done it without help from my student Amanjit, who joined as a co-author. Includes new implementations with neural OT. Figure shows our primal and dual displacement interpolations w.r.t. "transport KL-geometry" on the simplex.
1
1
6
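For reference, one standard way to write the divergence described in the card above, for a strictly convex generator φ (my notation, not necessarily the paper's):

$$
B_\phi(\mu, \nu) \;=\; \inf_{\pi \in \Pi(\mu, \nu)} \int D_\phi(x, y)\, \mathrm{d}\pi(x, y),
\qquad
D_\phi(x, y) \;=\; \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle .
$$

Taking φ to be negative entropy on the simplex makes D_φ the KL divergence, which is presumably the "transport KL-geometry" mentioned above.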
We were lucky to be hosted by Michal and Marco, who helped a lot with this! Check out our paper if you want to see the results or the math behind it. We also open-sourced our semidiscrete solver in the OTT-JAX library. https://t.co/ftgueENC1R
https://t.co/R4JzorAMGV
github.com
Optimal transport tools implemented with the JAX framework, to solve large scale matching problems of any flavor. - ott-jax/ott
1
1
11
You only need to store one real number per data sample. You can precompute these numbers once using stochastic convex ✨optimization. Whenever you want to train a flow matching model, you assign noise to data using these numbers.
1
1
8
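A minimal sketch of what the tweet above describes, assuming a squared-Euclidean cost, a Gaussian noise source, and uniform weights over the data; this is not the paper's code or the OTT-JAX solver, and all names below are illustrative:

```python
import jax
import jax.numpy as jnp

def assign(z, data, g):
    # cost[k, i] = 0.5 * ||z_k - x_i||^2; send z_k to the index minimizing cost - g_i
    cost = 0.5 * jnp.sum((z[:, None, :] - data[None, :, :]) ** 2, axis=-1)
    return jnp.argmin(cost - g[None, :], axis=1)

def semidual_grad(data, g, z):
    # Stochastic gradient of the concave semi-dual objective in g:
    # uniform target weights minus the fraction of noise currently assigned to each x_i.
    idx = assign(z, data, g)
    hit = jax.nn.one_hot(idx, data.shape[0]).mean(axis=0)
    return 1.0 / data.shape[0] - hit

def fit_potentials(data, key, steps=2000, batch=512, lr=0.5):
    # Precompute the one real number per data sample (its dual potential)
    # by stochastic gradient ascent.
    g = jnp.zeros(data.shape[0])
    step = jax.jit(lambda z, g: g + lr * semidual_grad(data, g, z))
    for _ in range(steps):
        key, sub = jax.random.split(key)
        z = jax.random.normal(sub, (batch, data.shape[1]))  # noise source
        g = step(z, g)
    return g
```

When training the flow-matching model, e.g. with g = fit_potentials(data, jax.random.PRNGKey(0)), each sampled noise batch z is then paired with data[assign(z, data, g)] instead of being matched independently.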
Super excited to share what @stephenz_y and I've been up to during our internship at 🍎: Using optimal transport makes flows straighter and generation faster in flow matching, but small batch OT is biased and large batch OT is slow. What to do? Use semidiscrete OT! 🧵
3
39
269
the last time the internet truly felt like it wasn't just a trough full of slop, computers looked like this
415
501
10K
there's a weird kind of grief that comes with late august and i can't explain it
1K
30K
235K
GenAI isn't just a technology; it's an informational pollutant—a pervasive cognitive smog that touches and corrupts every aspect of the Internet. It's not just a productivity tool; it's a kind of digital acid rain, silently eroding the value of all information. Every image is no
469
1K
7K
It’s pretty incredible that FID, a metric introduced ~10 years ago, is still driving much of the progress of image generative modeling. Among its many flaws, FID can easily be gamed by a method that does nothing more than memorize the training set. Adding to the
5
20
210
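To make the memorization point concrete, a minimal sketch of FID computed from feature statistics (assuming features, e.g. Inception activations, have already been extracted; not any official implementation):

```python
import jax.numpy as jnp

def fid_from_features(feats_real, feats_gen):
    # FID = ||mu_r - mu_g||^2 + Tr(cov_r + cov_g - 2 (cov_r cov_g)^{1/2})
    mu_r, mu_g = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    cov_r = jnp.cov(feats_real, rowvar=False)
    cov_g = jnp.cov(feats_gen, rowvar=False)
    # Tr((cov_r cov_g)^{1/2}) via the eigenvalues of the product, which are
    # real and non-negative for PSD covariance matrices.
    eigs = jnp.linalg.eigvals(cov_r @ cov_g)
    tr_sqrt = jnp.sum(jnp.sqrt(jnp.clip(eigs.real, 0.0)))
    return jnp.sum((mu_r - mu_g) ** 2) + jnp.trace(cov_r) + jnp.trace(cov_g) - 2.0 * tr_sqrt
```

Calling fid_from_features(train_feats, train_feats) returns (numerically) zero: a model that does nothing but replay the training set matches its feature statistics exactly, which is how memorization "wins" on FID.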
@nellohead My computer is a big strong boy and doesn't need eepies
7
38
2K
Lots of hype around multimodal FMs, virtual cells (and labs?), all-atom design... I really think core algorithms (not just scale/integration) will solve the next problems in AIxBio. Take Transition Path Sampling: it models transitions for dynamics, optimization, and cell fate. 👇
2
23
181
I’m an optimist by nature. When an outcome isn't what I was hoping for, I often see it as an opportunity to learn and grow stronger. But sometimes, it gets easier to agree with Shakespeare: “life is a tale told by an idiot, full of sound and fury, signifying nothing”.
3
4
58
@SharonYixuanLi NeurIPS rebuttal = research triathlon: sprint for fresh baselines, marathon of ablations, high-jump over Reviewer 2’s existential dread. Survive the week and you’ve leveled up harder than any course can teach.
1
1
12
This is deeply concerning and not what we would expect from a high-quality conference like @NeurIPSConf! The rebuttal guidelines explicitly allowed one optional PDF before (see https://t.co/4X8LC14yZ5 as of July 16). Removing this option, without even prior notice, is not fair
@NeurIPSConf, why take away the authors' option to include figures in their rebuttals during the rebuttal period? Grounding the discussion in hard evidential data (like plots) makes resolving disagreements much easier for both the authors and the reviewers. Left: NeurIPS
1
3
13