Ferdinand Schlatt Profile
Ferdinand Schlatt

@fschlatt1

Followers
148
Following
124
Media
8
Statuses
78

PhD Student, efficient and effective neural IR models 🧠🔎

Joined October 2017
@fschlatt1
Ferdinand Schlatt
7 days
RT @webis_de: Honored to win the ICTIR Best Paper Honorable Mention Award for "Axioms for Retrieval-Augmented Generation"! Our new axioms a….
0
4
0
@fschlatt1
Ferdinand Schlatt
8 days
RT @ReNeuIRWorkshop: The fourth ReNeuIR Workshop @ #SIGIR2025 is about to start. Join us! Program of the day is here:
reneuir.org
Workshop on Reaching Efficiency in Neural Information Retrieval
0
2
0
@fschlatt1
Ferdinand Schlatt
9 days
RT @webis_de: Happy to share that our paper "The Viability of Crowdsourcing for RAG Evaluation" received the Best Paper Honourable Mention….
0
6
0
@fschlatt1
Ferdinand Schlatt
9 days
Want to know how to make bi-encoders more than 3x faster with a new backbone encoder model? Check out our talk on the Token-Independent Text Encoder (TITE) at #SIGIR2025 in the efficiency track. It pools vectors within the model to improve efficiency.
0
11
61
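The idea behind pooling vectors within the model, as described in the TITE tweet above, can be illustrated with a toy sketch (this is a conceptual illustration only, not the TITE architecture itself): mean-pooling consecutive token vectors between encoder layers shrinks the sequence, and since self-attention cost is quadratic in sequence length, later layers become much cheaper.

```python
import numpy as np

def pool_tokens(token_vecs: np.ndarray, factor: int) -> np.ndarray:
    """Mean-pool groups of `factor` consecutive token vectors,
    shrinking the sequence length by that factor."""
    seq_len, dim = token_vecs.shape
    # Pad with zero vectors so the length is divisible by the factor.
    pad = (-seq_len) % factor
    if pad:
        token_vecs = np.vstack([token_vecs, np.zeros((pad, dim))])
    return token_vecs.reshape(-1, factor, dim).mean(axis=1)

# A toy "encoder" that halves the sequence between layers: attention
# over n tokens costs O(n^2), so each pooling step cuts the cost of
# every subsequent layer by roughly 4x.
rng = np.random.default_rng(0)
x = rng.normal(size=(128, 16))  # 128 tokens, 16-dim vectors
lengths = [x.shape[0]]
for _ in range(3):
    x = pool_tokens(x, factor=2)
    lengths.append(x.shape[0])
print(lengths)  # [128, 64, 32, 16]
```

With three pooling steps the final layer attends over 16 vectors instead of 128, and the pooled vectors can then be aggregated into a single text embedding.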
@fschlatt1
Ferdinand Schlatt
11 days
RT @MrParryParry: Really like this work, if you haven't read it yet, have a look: PSA from Ian Soboroff! move to K….
0
1
0
@fschlatt1
Ferdinand Schlatt
11 days
RT @ir_glasgow: Now it’s @MrParryParry presenting the reproducibility efforts of a large team of researchers in relation to the shelf life….
0
5
0
@fschlatt1
Ferdinand Schlatt
12 days
Thank you @cadurosar for the shout-out of Lightning IR in the LSR tutorial at #SIGIR2025. If you want to fine-tune your own LSR models, check out our framework at
0
2
12
@fschlatt1
Ferdinand Schlatt
4 months
@maik_froebe @hscells @ShengyaoZhuang @bevan_koopman @guidozuc @bennostein @martinpotthast @matthias_hagen Short: Rank-DistiLLM: Closing the Effectiveness Gap Between Cross-Encoders and LLMs for Passage Re-ranking Full: Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders
0
7
29
@fschlatt1
Ferdinand Schlatt
4 months
What an honor to receive both the best short paper award and the best paper honourable mention award at #ECIR2025. Thank you to all the co-authors @maik_froebe @hscells @ShengyaoZhuang @bevan_koopman @guidozuc @bennostein @martinpotthast @matthias_hagen 🥳.
4
6
43
@fschlatt1
Ferdinand Schlatt
4 months
RT @antonio_mallia: It was a really pleasant surprise to learn that our paper “Efficient Constant-Space Multi-Vector Retrieval” aka ConstBE….
0
8
0
@fschlatt1
Ferdinand Schlatt
4 months
Next up at #ECIR2025, @maik_froebe presenting his fantastic work on corpus sub-sampling and how to evaluate retrieval systems more efficiently.
0
4
24
@fschlatt1
Ferdinand Schlatt
4 months
RT @MrParryParry: Now accepted at #SIGIR2025! looking forward to discussing evaluation with LLMs at #ECIR2025 this week and of course in Pa….
0
7
0
@fschlatt1
Ferdinand Schlatt
4 months
RT @tomaarsen: I've just ported the excellent monoELECTRA-{base, large} reranker models from @fschlatt1 & the research network Webis Group….
0
18
0
@fschlatt1
Ferdinand Schlatt
5 months
RT @macavaney: We re-annotated DL’19. The results? 👇.
0
1
0
@fschlatt1
Ferdinand Schlatt
5 months
RT @MrParryParry: 🚨 New Pre-Print! 🚨 Reviewer 2 has once again asked for DL’19, what can you say in rebuttal?  We have re-annotated DL’19 i….
0
11
0
@fschlatt1
Ferdinand Schlatt
9 months
Tight integration with the library allows easy access to a wide range of IR datasets, including MS MARCO, various TREC collections, BEIR, and more. Lightning IR also integrates for quick evaluation on these datasets.
0
0
4
@fschlatt1
Ferdinand Schlatt
9 months
Several pre-trained models are available out of the box. Check the model zoo for an overview. Some models were natively trained with Lightning IR, but integrating external checkpoints is also a breeze. Model zoo: 🦓🦘🦁.
1
0
4
@fschlatt1
Ferdinand Schlatt
9 months
Lightning IR implements bi-encoder and cross-encoder models as configurable and extensible wrappers around Hugging Face backbone models. Want to fine-tune a T5-based Col(BERT) model or a monoDeBERTa cross-encoder? Lightning IR has you covered.
1
0
4
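The bi-encoder/cross-encoder distinction mentioned above can be sketched with a minimal toy example (a conceptual illustration using a stable-hashed bag-of-words embedding, not Lightning IR's actual models): a bi-encoder embeds queries and documents independently, so the document side can be pre-computed and indexed offline, and scoring reduces to a dot product.

```python
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy bag-of-words embedding: stable-hash each token into a bucket,
    then L2-normalize so dot products behave like cosine similarity."""
    vec = np.zeros(dim)
    for tok in text.lower().split():
        bucket = int(hashlib.md5(tok.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Bi-encoder scoring: documents are embedded once, offline.
docs = [
    "neural ranking models for information retrieval",
    "a recipe for sourdough bread",
]
doc_matrix = np.stack([embed(d) for d in docs])  # offline index

query = "neural information retrieval"
scores = doc_matrix @ embed(query)               # one dot product per doc
best = int(np.argmax(scores))
print(docs[best])  # the retrieval-related document wins
```

A cross-encoder would instead feed the query and document jointly through one model, which is more effective but forbids offline pre-computation; that trade-off is why frameworks typically support both architectures.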
@fschlatt1
Ferdinand Schlatt
9 months
Lightning IR builds on top of PyTorch Lightning to provide scalable and reproducible fine-tuning and inference of neural ranking models. Next to the usual PyTorch Lightning features, Lightning IR provides support for indexing, searching, and re-ranking.
1
0
4
@fschlatt1
Ferdinand Schlatt
9 months
Happy to share our framework for fine-tuning and running neural ranking models, Lightning IR, was accepted as a demo at #WSDM25 🥳. Pre-print: Code: Docs: A quick rundown of Lightning IR's main features:
github.com
One-stop shop for running and fine-tuning transformer-based language models for retrieval - webis-de/lightning-ir
2
7
47