@armandjoulin
Armand Joulin
3 months
@giffmana @fouriergalois @fly51fly It's actually SwAV that introduced it

Replies

@fly51fly
fly51fly
3 months
[LG] Poly-View Contrastive Learning
The paper presents a novel framework called Poly-View Contrastive Learning, challenging the conventional belief that contrastive learning requires large sample sizes and extensive training epochs to enhance…
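The idea under discussion is contrasting more than two augmented views of each image, so that every pair of views of the same image acts as a positive. Below is a rough sketch of a multi-positive InfoNCE loss in that spirit; the function name, tensor shapes, and temperature are assumptions for illustration, not the paper's exact objective.

```python
# Illustrative multi-positive InfoNCE over M views per image (not the
# exact Poly-View Contrastive Learning objective).
import torch
import torch.nn.functional as F

def poly_view_info_nce(z: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z: (B, M, D) tensor holding M embeddings per image."""
    B, M, D = z.shape
    z = F.normalize(z.reshape(B * M, D), dim=-1)      # unit-norm embeddings
    sim = z @ z.t() / temperature                      # (B*M, B*M) similarities
    sim.fill_diagonal_(float("-inf"))                  # exclude self-similarity

    # Views coming from the same image (excluding self) are positives.
    img_id = torch.arange(B).repeat_interleave(M)
    pos = (img_id[:, None] == img_id[None, :]) & ~torch.eye(B * M, dtype=torch.bool)

    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average the log-likelihood over each view's M-1 positives.
    loss = -log_prob.masked_fill(~pos, 0.0).sum(dim=1) / pos.sum(dim=1)
    return loss.mean()

# Toy usage: 8 images, 4 views each, 128-d embeddings.
print(poly_view_info_nce(torch.randn(8, 4, 128)))
```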
@fouriergalois
Ω.KendrickPlumard
3 months
@fly51fly @giffmana seems huge and legit (Apple)
@giffmana
Lucas Beyer (bl16)
3 months
@fouriergalois @fly51fly seems reasonable. I actually did a similar experiment with a similar conclusion in my thesis a long time ago, see below :) The above paper is contrastive self-supervised, not image-text. I doubt it carries over as-is, but something like SILC uses the insight.
@giffmana
Lucas Beyer (bl16)
3 months
@fouriergalois @fly51fly yeah exactly, it's mostly for improving on dense downstream tasks. I believe it's DINO that started it; at least we informally call it "adding the dino trick" :) However, I'm not sure if it's beneficial when controlling for the increased pre-train cost? Need to see a plot of that.
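The "dino trick" referred to here is the SwAV/DINO-style multi-crop augmentation: a couple of large global crops plus several small local crops of each image. A minimal sketch follows; the crop counts, sizes, and scale ranges are illustrative defaults, not the exact recipe from either paper.

```python
# Sketch of a multi-crop augmentation: global + local views per image.
from PIL import Image
from torchvision import transforms

global_crop = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.4, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
local_crop = transforms.Compose([
    transforms.RandomResizedCrop(96, scale=(0.05, 0.4)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

def multi_crop(img: Image.Image, n_global: int = 2, n_local: int = 6) -> list:
    """Return all views of one image, global crops first."""
    return ([global_crop(img) for _ in range(n_global)]
            + [local_crop(img) for _ in range(n_local)])
```

In DINO, only the global crops are passed to the teacher while the student sees every crop, which ties the small local views back to the full image.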
@giffmana
Lucas Beyer (bl16)
3 months
@armandjoulin @fouriergalois @fly51fly Ah thanks for the correction! I must admit that I kinda slept on SwAV, and I say this as a fan of most of the authors!