The TREC Fair Ranking track evaluates systems according to how well they fairly rank documents. The 2020 track focuses on scholarly search, fairly ranking academic abstracts and papers from authors belonging to different groups.
You can find all the information about the 2019 TREC Fair Ranking track here.
TIMELINE
May 19, 2020: guidelines released
July 1, 2020: training queries and corpus released
August 17, 2020: evaluation queries released
August 28, 2020: submissions due
late September 2020: evaluated submissions returned
TREC 2020 Fair Ranking Track evaluation data: the evaluation queries are in the TREC-Fair-Ranking-eval-sample-no-rel.json file, and the evaluation sequences are in TREC-Fair-Ranking-eval-seq.csv. For the retrieval task, please retrieve results from the subset of OpenCorpus provided with the training data. Note: by downloading the data samples contained in this package, you agree to the Semantic Scholar Dataset License Agreement.
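For participants getting started, the following is a minimal Python sketch for reading the two evaluation files. The file schema is not documented on this page, so treat the parsing as an assumption to verify against the released data: the loader allows the queries file to be either a single JSON array or JSON Lines (one object per line), and it makes no assumption about field names.

import csv
import json

# File names as distributed with the evaluation data (see above).
QUERIES_PATH = "TREC-Fair-Ranking-eval-sample-no-rel.json"
SEQUENCES_PATH = "TREC-Fair-Ranking-eval-seq.csv"

def load_eval_queries(path):
    """Load the evaluation queries.

    Assumption: the file is either a single JSON array or JSON Lines,
    so try a plain json.loads first and fall back to per-line parsing.
    """
    with open(path, encoding="utf-8") as f:
        text = f.read()
    try:
        data = json.loads(text)
        return data if isinstance(data, list) else [data]
    except json.JSONDecodeError:
        return [json.loads(line) for line in text.splitlines() if line.strip()]

def load_eval_sequences(path):
    """Load the evaluation sequences as a list of CSV rows."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.reader(f))

if __name__ == "__main__":
    queries = load_eval_queries(QUERIES_PATH)
    sequences = load_eval_sequences(SEQUENCES_PATH)
    print(f"{len(queries)} evaluation queries, {len(sequences)} sequence rows")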
General references to the TREC Fair Ranking track should use the following citation:
@inproceedings{trec-fair-ranking-2019,
  author    = {Asia J. Biega and Fernando Diaz and Michael D. Ekstrand and Sebastian Kohlmeier},
  title     = {Overview of the TREC 2019 Fair Ranking Track},
  booktitle = {The Twenty-Eighth Text REtrieval Conference (TREC 2019) Proceedings},
  year      = {2019}
}
If you use a specific dataset, please cite the notebook paper associated with the data.