The TREC Fair Ranking track evaluates how fairly retrieval systems rank documents. The 2021 track focuses on fairly prioritising Wikimedia articles for editing, so that articles from different groups receive fair exposure.
Timeline
- April/May 2021: guidelines released
- June/July 2021: training queries and corpus released
- July 2021: evaluation queries released
- August 2, 2021: submissions due
- September 2021: evaluated submissions returned
Downloads
The TREC 2021 Fair Ranking Track participation guidelines, experimentation protocol, data, and evaluation scripts are available below.
- Participant instructions
- Corpus (direct link, 15 GB)
- Training topics file (direct link, 54 MB)
- Evaluation topics file (direct link)
- Metadata file (direct link, 6.9 MB)
The files are also available from a Globus repository. Most US research universities, and many other institutions, support Globus; you can also download the files with Globus Connect Personal. Alternatively, you can access the file repository over HTTP through the Boise State data repository.
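For reference, here is a minimal sketch of how the corpus and topic files might be read, assuming they are distributed as gzipped JSON-lines (one JSON object per line). The filename and field names below are placeholders, not the official schema; consult the participant instructions for the authoritative file layout.

```python
import gzip
import json

def read_jsonl_gz(path):
    """Yield one parsed JSON object per line of a gzipped JSON-lines file.

    Assumes the track data is gzipped JSON-lines; field names such as
    "id", "title", or "text" in downstream code are illustrative only.
    """
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

if __name__ == "__main__":
    # Example usage: count documents in the corpus (placeholder filename).
    n_docs = sum(1 for _ in read_jsonl_gz("trec_corpus.json.gz"))
    print(f"corpus contains {n_docs} documents")
```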
Resources
- How To TREC
- Google Group
- Slack channel
- 2019 TREC Fair Ranking (archive)
- 2020 TREC Fair Ranking (archive)
Citation
References to the TREC 2021 Fair Ranking Track should use the following citation:
@inproceedings{trec-fair-ranking-2021,
  author    = {Michael D. Ekstrand and Graham McDonald and Amifa Raj and Isaac Johnson},
  title     = {Overview of the {TREC} 2021 Fair Ranking Track},
  booktitle = {The Thirtieth Text REtrieval Conference (TREC 2021) Proceedings},
  year      = {2022}
}
If you use a specific dataset, please cite the notebook paper associated with the data.