Lucas Theis

@lucastheis

Followers
3K
Following
6K
Media
51
Statuses
428

Building something new. Previously @GoogleDeepMind, @twitter, Magic Pony, @bethgelab.

London
Joined August 2009
@lucastheis
Lucas Theis
1 year
What does it mean for an image, video, or text to be *realistic*? Despite how far we've come in *generating* realistic data, *quantifying* realism is still a poorly understood problem. I've shared my thoughts on how to correctly quantify realism here:
2
21
132
@lucastheis
Lucas Theis
1 year
RT @sedielem: #ICML2024 diffusion circle πŸ“’: let's meet at 3PM on Thursday at registration, near the Vienna tourism desk. We'll find a spot….
0
17
0
@lucastheis
Lucas Theis
1 year
RT @elluba: The @CVPR AI Art Gallery is now live πŸ€–πŸŽ¨. Featuring 115+ artworks using or about computer vision😎. View them here: https://t.co/….
0
31
0
@lucastheis
Lucas Theis
1 year
I am at CVPR to give a talk at the AI for Streaming workshop and to present our paper on C3. Let me know if you're in Seattle and would like to catch up. #CVPR2024
2
2
21
@lucastheis
Lucas Theis
1 year
RT @mhutter42: Don't miss this ICML Spotlight Talk by one of my colleagues, and/or even better read his very nice exposition illustrating w….
0
5
0
@lucastheis
Lucas Theis
1 year
RT @emidup: We build neural codecs from a *single* image or video, achieving compression performance close to SOTA models trained on large….
0
35
0
@lucastheis
Lucas Theis
2 years
RT @Official_CLIC: CLIC is back! Details are now online @ The validation phase starts in 3 weeks. Get ready!🏎️.
0
6
0
@lucastheis
Lucas Theis
2 years
RT @karen_ullrich: 🀩 Tomorrow the #ICML2023 workshops shall begin.πŸŽ™οΈ Join me at 9 AM I will discuss absolutely all there is to know about #….
0
7
0
@lucastheis
Lucas Theis
2 years
RT @george_toderici: CLIC (Challenge on Learned Image Compression) is not dead. Stay tuned for some exciting news.
0
2
0
@lucastheis
Lucas Theis
2 years
0
0
4
@lucastheis
Lucas Theis
2 years
Arbitrary resolutions are handled by generating images patch by patch. This makes it easy to use network architectures that don't scale or generalize well to arbitrary resolutions (e.g., attention layers).
1
0
5
@lucastheis
Lucas Theis
2 years
On the other hand, we show that a simple two-stage diffusion approach can be made to work very well with minor tweaks (e.g., to the noise schedule).
1
0
2
@lucastheis
Lucas Theis
2 years
As others have found, beating well-tuned adversarial approaches to compression (e.g., HiFiC, PO-ELIC) with diffusion is challenging. Repurposing text-to-image models (e.g., Stable Diffusion, Imagen) does not work well as these have not been designed to reproduce fine details.
1
1
2
@lucastheis
Lucas Theis
2 years
New paper on high-resolution image compression with diffusion models, achieving SoTA results in terms of FID:
1
21
115
@lucastheis
Lucas Theis
2 years
RT @_akhaliq: High-Fidelity Image Compression with Score-based Generative Models. Despite the tremendous success of diffusion generative mo….
0
16
0
@lucastheis
Lucas Theis
2 years
For a distribution which is heavy-tailed enough, you should expect to wait forever.
0
0
1
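The "wait forever" claim follows from the standard renewal-theory fact that the expected wait is E[G²]/(2·E[G]) for gaps G: a Pareto distribution with 1 &lt; α ≤ 2 has a finite mean but an infinite second moment, so the expected wait diverges. A small sketch (helper names are mine, not from the thread) makes this concrete by truncating the second-moment integral at larger and larger cutoffs:

```python
def pareto_mean(alpha):
    """E[G] for a Pareto(alpha) distribution on [1, inf); finite for alpha > 1."""
    return alpha / (alpha - 1)

def truncated_second_moment(alpha, T):
    """Integral of x^2 * alpha * x^(-alpha - 1) from 1 to T, i.e. E[G^2]
    restricted to G <= T. For alpha <= 2 this grows without bound as T -> inf."""
    return alpha / (2 - alpha) * (T ** (2 - alpha) - 1)

alpha = 1.5                # heavy-tailed: E[G] is finite but E[G^2] is not
print(pareto_mean(alpha))  # 3.0 -> the mean gap between "buses" is finite
for T in (1e2, 1e4, 1e6):
    # Expected wait ~ E[G^2] / (2 E[G]) keeps growing as longer and longer
    # gaps are taken into account -- it diverges in the limit.
    print(truncated_second_moment(alpha, T) / (2 * pareto_mean(alpha)))
    # 4.5, 49.5, 499.5
```

Each factor-of-100 increase in the cutoff multiplies the expected wait by 10, so the full expectation is infinite even though the average gap is only 3.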
@lucastheis
Lucas Theis
2 years
This can be viewed as an extreme version of the waiting paradox. Even when the average distance between buses is the same, the waiting time can differ significantly depending on the *distribution* over distances.
1
0
0
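The waiting paradox mentioned above can be checked with a few lines. The sketch below (my own illustration, not from the thread) uses the fact that a passenger arriving uniformly at random lands in a gap with probability proportional to its length and waits half of that gap on average, so the expected wait is the length-biased average E[G²]/(2·E[G]):

```python
def mean_wait(gaps):
    """Average wait for a passenger arriving uniformly at random in time,
    given a sequence of gaps between consecutive buses."""
    total = sum(gaps)
    # An arrival lands in a gap with probability proportional to its length,
    # and waits g/2 on average within a gap of length g, so
    # E[wait] = E[G^2] / (2 E[G]) -- the length-biased average.
    return sum(g * g / 2 for g in gaps) / total

# Same average gap (10 minutes), two different distributions over gaps:
print(mean_wait([10.0, 10.0]))  # 5.0  -> regular service: wait = mean gap / 2
print(mean_wait([1.0, 19.0]))   # 9.05 -> bursty service: nearly twice the wait
```

Both schedules run the same number of buses per hour, yet the bursty one almost doubles the average wait.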
@lucastheis
Lucas Theis
2 years
Worse, the *expected* runtime can be infinite when averaged over target distributions.
1
0
0
@lucastheis
Lucas Theis
2 years
Surprisingly, although the coding cost of "greedy" rejection sampling (Harsha et al., 2008) tends to be much lower than that of standard rejection sampling, the runtime is identical, namely exponential in the ∞-divergence between the proposal and target distributions.
1
0
0
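For standard rejection sampling the runtime claim is easy to verify empirically: the acceptance bound M = max_x Q(x)/P(x) equals exp(D_∞(Q‖P)), and the number of proposals until acceptance is geometric with mean M. A toy sketch over discrete distributions (textbook rejection sampling, not the paper's scheme; names are illustrative):

```python
import random

random.seed(1)

def rejection_sample(target, proposal):
    """Sample from `target` via rejection sampling from `proposal` (both dicts
    mapping outcomes to probabilities). Returns the sample and the trial count."""
    # M = max_x target(x) / proposal(x) = exp(D_inf(target || proposal))
    M = max(target[x] / proposal[x] for x in target)
    outcomes, probs = zip(*proposal.items())
    trials = 0
    while True:
        trials += 1
        x = random.choices(outcomes, weights=probs)[0]
        # Accept with probability target(x) / (M * proposal(x)) <= 1.
        if random.random() < target[x] / (M * proposal[x]):
            return x, trials

proposal = {'a': 0.5, 'b': 0.5}
target   = {'a': 0.9, 'b': 0.1}  # M is driven by the ratio 0.9 / 0.5 = 1.8

runs = [rejection_sample(target, proposal)[1] for _ in range(20_000)]
print(sum(runs) / len(runs))  # close to 1.8 = exp(∞-divergence)
```

The average trial count concentrates around M regardless of how cleverly the accepted sample is encoded, which is why the greedy variant saves bits but not time.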
@lucastheis
Lucas Theis
2 years
New paper with Greg Flamich on how to communicate samples efficiently (or *channel simulation*). In addition to proposing a fast new scheme, working on this paper clarified a few things for me about the runtime of existing schemes.
arxiv.org
We consider channel simulation protocols between two communicating parties, Alice and Bob. First, Alice receives a target distribution $Q$, unknown to Bob. Then, she employs a shared coding...
1
3
20