Open-Source Information Retrieval Courses @ TU Wien (Python; updated Jun 12, 2023)
Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch
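Dense retrieval models like those trained with this library typically score documents by the similarity of query and document embeddings. A minimal sketch of that ranking step, using toy vectors in place of actual encoder outputs (the library's real training and encoding code is not shown here):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def rank(query_vec, doc_vecs):
    """Return doc ids sorted by descending similarity to the query.

    In a real dense retriever, query_vec and doc_vecs would be produced
    by a trained neural encoder; here they are supplied directly.
    """
    scores = {doc_id: cosine(query_vec, vec) for doc_id, vec in doc_vecs.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

In practice the document vectors are pre-computed and indexed for approximate nearest-neighbour search rather than scored exhaustively as above.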
Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
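This work distills ranking knowledge from a large teacher into an efficient student by matching score *margins* between positive and negative documents rather than absolute scores (a Margin-MSE objective). A hedged, self-contained sketch of that loss over plain floats (real training would operate on model-produced tensors):

```python
def margin_mse(s_pos, s_neg, t_pos, t_neg):
    """Margin-MSE distillation loss (sketch).

    Mean squared error between the student's score margin (positive minus
    negative document) and the teacher's margin, averaged over the batch.
    s_* are student scores, t_* are teacher scores, given as lists of floats.
    """
    margins = [
        ((sp - sn) - (tp - tn)) ** 2
        for sp, sn, tp, tn in zip(s_pos, s_neg, t_pos, t_neg)
    ]
    return sum(margins) / len(margins)
```

Matching margins instead of raw scores lets student and teacher operate on different score scales, which is what makes the distillation work across architectures.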
Unified Learned Sparse Retrieval Framework
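Learned sparse retrieval represents queries and documents as sparse term-weight vectors and scores them with a dot product over an inverted index. A minimal sketch of that retrieval step (the term weights, which a learned model would predict, are supplied as plain dicts here):

```python
from collections import defaultdict

def build_index(docs):
    """Inverted index: term -> list of (doc_id, weight)."""
    index = defaultdict(list)
    for doc_id, weights in docs.items():
        for term, w in weights.items():
            index[term].append((doc_id, w))
    return index

def search(index, query_weights):
    """Score each document by the dot product of query and document
    term weights, accumulated term-by-term over the inverted index."""
    scores = defaultdict(float)
    for term, qw in query_weights.items():
        for doc_id, dw in index.get(term, []):
            scores[doc_id] += qw * dw
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Because scoring only touches postings for terms the query actually contains, learned sparse models keep the efficiency of classical inverted-index retrieval while using neurally learned weights.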
Source code for: On the Effect of Low-Frequency Terms on Neural-IR Models, SIGIR'19
Neural-IR-Explorer: A Content-Focused Tool to Explore Neural Re-Ranking Results
SIGIR 2023: Adapting Learned Sparse Retrieval to Long Documents
"Exploring the Effectiveness of Multi-stage Fine-tuning for Cross-encoder Re-rankers", ECIR 2025.
"Document Quality Scoring for Web Crawling", WOWS 2025.
Beyond Redundancy: Embedding-Aware Novelty Reranking in Retrieval-Augmented Generation
Fine-tuning of transformer-based models to predict semantically relevant outlinks in large-scale web crawling. Includes binary classification, score estimation with QualT5, and a multimodal BERT architecture integrating textual and metadata features for robust frontier prioritisation.
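How such predicted quality scores could drive frontier prioritisation can be sketched with a priority queue; the `CrawlFrontier` class and its interface below are illustrative, not the repository's actual API:

```python
import heapq

class CrawlFrontier:
    """Toy crawl frontier: outlinks with higher predicted quality
    are popped (crawled) first. Scores would come from a classifier
    such as the ones described above; here they are supplied directly."""

    def __init__(self):
        self._heap = []
        self._seen = set()

    def add(self, url, score):
        """Enqueue a URL once; duplicates are ignored."""
        if url not in self._seen:
            self._seen.add(url)
            heapq.heappush(self._heap, (-score, url))  # negate: max-heap

    def pop(self):
        """Return the highest-scoring (url, score) pair."""
        neg_score, url = heapq.heappop(self._heap)
        return url, -neg_score
```

The seen-set doubles as simple deduplication, so re-discovering a URL through another outlink does not re-enqueue it.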
Master's thesis on the reproducibility and interpretability of neural ranking models
Neural information retrieval and ranking, and extractive question answering