The TREC Fair Ranking track evaluates systems according to how fairly they rank documents.
The 2022 track focuses on fairly prioritising Wikimedia articles for editing, providing fair exposure to articles from different groups.
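To give a flavour of what "fair exposure" means here, the sketch below computes how much attention each group of articles receives in a ranking under a position-based browsing model and compares it to a target distribution. The function names, the geometric discount with parameter `gamma`, and the L2 comparison are assumptions chosen for illustration only; the official metrics are defined in the track's participation guidelines and evaluation scripts.

```python
# Illustrative sketch of an exposure-based group fairness measure.
# The discount model and comparison are assumptions, not the official metric.
from collections import defaultdict
import math


def group_exposure(ranking, doc_groups, gamma=0.5):
    """Share of attention each group receives under a geometric position discount.

    ranking    : list of document ids, best first
    doc_groups : dict mapping document id -> group label
    gamma      : assumed patience parameter of the browsing model
    """
    exposure = defaultdict(float)
    for rank, doc in enumerate(ranking):
        exposure[doc_groups[doc]] += gamma ** rank
    total = sum(exposure.values())
    return {g: e / total for g, e in exposure.items()}


def unfairness(ranking, doc_groups, target, gamma=0.5):
    """L2 distance between delivered and target group exposure (lower is fairer)."""
    delivered = group_exposure(ranking, doc_groups, gamma)
    groups = set(target) | set(delivered)
    return math.sqrt(sum((delivered.get(g, 0.0) - target.get(g, 0.0)) ** 2
                         for g in groups))


if __name__ == "__main__":
    # Hypothetical documents labelled with a single group attribute.
    docs = {"d1": "Europe", "d2": "Africa", "d3": "Europe", "d4": "Asia"}
    target = {"Europe": 0.4, "Africa": 0.3, "Asia": 0.3}
    print(unfairness(["d1", "d3", "d2", "d4"], docs, target))
```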
Timeline
- May 2022: guidelines released
- June 2022: training queries and corpus released
- July 2022: evaluation queries released
- 31 August 2022: submissions due
- September 2022: evaluated submissions returned
Downloads
The TREC 2022 Fair Ranking Track participation guidelines, experimentation protocol, data and evaluation scripts will be made available here.
Resources
- How To TREC
- Google Group
- Slack channel
- 2019 TREC Fair Ranking (archive)
- 2020 TREC Fair Ranking (archive)
- 2021 TREC Fair Ranking (archive)
- NTCIR 2023 Fair Web track — if you want more fair ranking!
Citation
References to the TREC 2021 Fair Ranking Track should use the following citation:
@inproceedings{trec-fair-ranking-2021,
  Author = {Michael D. Ekstrand and Graham McDonald and Amifa Raj and Isaac Johnson},
  Booktitle = {The Thirtieth Text REtrieval Conference (TREC 2021) Proceedings},
  Title = {Overview of the TREC 2021 Fair Ranking Track},
  Year = {2022}
}
If you use a specific dataset, please cite the notebook paper associated with the data.
Organizers
- Michael D. Ekstrand, People and Information Research Team (PIReT), Boise State University
- Graham McDonald, Information Retrieval Group, University of Glasgow
- Amifa Raj, People and Information Research Team (PIReT), Boise State University
- Isaac Johnson, Wikimedia Foundation