Ferdinand Schlatt

PhD Student at the DBIS Group at the FSU Jena.


Room 3233

Ernst-Abbe-Platz 2

07743 Jena

My research focuses on using pre-trained transformer-based language models for information retrieval. I am particularly interested in augmenting the attention mechanism to better align with retrieval tasks, improving the effectiveness and/or efficiency of cross-encoders and bi-encoders.

Selected publications

  1. TITE: Token-Independent Text Encoder for Information Retrieval
    Ferdinand Schlatt, Tim Hagen, Martin Potthast, and 1 more author
    In Proceedings of SIGIR 2025, Jul 2025
  2. Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders
    Ferdinand Schlatt, Maik Fröbe, Harrisen Scells, and 6 more authors
    In Proceedings of ECIR 2025, Apr 2025
  3. Lightning IR: Straightforward Fine-tuning and Inference of Transformer-based Language Models for Information Retrieval
    Ferdinand Schlatt, Maik Fröbe, and Matthias Hagen
    In Proceedings of WSDM 2025, Mar 2025
  4. Investigating the Effects of Sparse Attention on Cross-Encoders
    Ferdinand Schlatt, Maik Fröbe, and Matthias Hagen
    In Proceedings of ECIR 2024, Mar 2024