
Long Document Ranking with Query-Directed Sparse Transformer

Jyun-Yu Jiang, Chenyan Xiong, Chia-Jung Lee, Wei Wang


Abstract
The computing cost of transformer self-attention often necessitates breaking long documents into chunks to fit pretrained models in document ranking tasks. In this paper, we design Query-Directed Sparse attention that induces IR-axiomatic structures in transformer self-attention. Our model, QDS-Transformer, enforces the principal properties desired in ranking: local contextualization, hierarchical representation, and query-oriented proximity matching, while it also enjoys efficiency from sparsity. Experiments on four fully supervised and few-shot TREC document ranking benchmarks demonstrate the consistent and robust advantage of QDS-Transformer over previous approaches, which either retrofit long documents into BERT or use sparse attention without emphasizing IR principles. We further quantify the computing complexity and demonstrate that our sparse attention with a TVM implementation is twice as efficient as fully-connected self-attention. All source code, trained models, and predictions of this work are available at https://github.com/hallogameboy/QDS-Transformer.
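The query-directed sparse attention described in the abstract combines local sliding-window contextualization with global attention for query tokens and sentence-level [CLS] tokens. The snippet below is a minimal sketch of such an attention mask, assuming a Longformer-style global/local pattern; the function name, token-type encoding, and window size are illustrative assumptions, not the authors' implementation.

```python
import torch

def qds_attention_mask(token_types: torch.Tensor, window: int = 64) -> torch.Tensor:
    """Sketch of a query-directed sparse attention pattern.

    token_types: 1-D LongTensor with one entry per token:
        0 = regular document token (local attention only)
        1 = query token            (global attention)
        2 = sentence-level [CLS]   (global attention, hierarchical)
    Returns a boolean [n, n] mask where True marks an allowed attention edge.
    The encoding and window size are illustrative.
    """
    n = token_types.size(0)
    idx = torch.arange(n)
    # Local contextualization: each token attends within a sliding window.
    local = (idx[None, :] - idx[:, None]).abs() <= window
    # Query and sentence [CLS] tokens attend to, and are attended by,
    # every position (query-oriented matching and hierarchical structure).
    is_global = token_types > 0
    global_edges = is_global[:, None] | is_global[None, :]
    return local | global_edges
```

In practice such a pattern would be executed by a custom sparse kernel (e.g., the TVM implementation mentioned above) rather than materialized as a dense n-by-n boolean matrix.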
Anthology ID:
2020.findings-emnlp.412
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4594–4605
URL:
https://aclanthology.org/2020.findings-emnlp.412
DOI:
10.18653/v1/2020.findings-emnlp.412
Cite (ACL):
Jyun-Yu Jiang, Chenyan Xiong, Chia-Jung Lee, and Wei Wang. 2020. Long Document Ranking with Query-Directed Sparse Transformer. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4594–4605, Online. Association for Computational Linguistics.
Cite (Informal):
Long Document Ranking with Query-Directed Sparse Transformer (Jiang et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.412.pdf
Code
 hallogameboy/QDS-Transformer
Data
MS MARCO