Angelos Katharopoulos
@angeloskath
4 years
Code is also available! If you want to experiment with clustered attention, all you need to do is pip install pytorch-fast-transformers and then use attention_type="improved-clustered". Enjoy!
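To make the idea concrete, here is a minimal NumPy sketch of query-clustered attention, written under stated assumptions: it is not the pytorch-fast-transformers implementation (the library is a PyTorch package, and its "improved-clustered" variant refines the approximation, roughly by re-evaluating the strongest key interactions exactly). This toy version just groups the queries with plain k-means, runs softmax attention once per centroid, and broadcasts each centroid's result to its cluster members, so the softmax cost drops from O(N^2) to O(C*N) for C clusters.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(Q, K, V):
    # Standard softmax attention: O(N^2) in the sequence length.
    A = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
    return A @ V

def clustered_attention(Q, K, V, n_clusters=4, n_iters=10):
    # Toy sketch: cluster the N queries with plain k-means (the paper
    # uses a faster scheme), attend once per centroid, and give every
    # query its centroid's output. Only n_clusters softmax rows are
    # ever computed, instead of N.
    rng = np.random.default_rng(0)
    centroids = Q[rng.choice(len(Q), n_clusters, replace=False)]
    for _ in range(n_iters):
        assign = np.argmin(
            ((Q[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1
        )
        for c in range(n_clusters):
            members = Q[assign == c]
            if len(members):
                centroids[c] = members.mean(0)
    out_c = full_attention(centroids, K, V)  # (n_clusters, d)
    return out_c[assign]                     # broadcast to all N queries

# Toy comparison against exact attention on random data.
rng = np.random.default_rng(1)
N, d = 64, 16
Q, K, V = rng.normal(size=(3, N, d))
approx = clustered_attention(Q, K, V)
exact = full_attention(Q, K, V)
print(approx.shape)  # (64, 16), same shape as the exact output
```

Because only the queries are approximated (keys and values are untouched), the same trick can be dropped into a pre-trained model at inference time, which is the point made in the accepted paper below.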
François Fleuret
@francoisfleuret
4 years
One paper accepted at @NeurIPSConf with @apoorv2904 and @angeloskath on speeding up attention by clustering the queries. The nice thing is that this can be used for inference with standard pre-trained models. @Idiap_ch @unige_en @EPFL_en @snsf_ch