
We show that with dependency parsing and rule-based rubrics, we can curate a high-quality sentence relation task by leveraging explicit discourse relations. We show that our curated dataset provides an excellent signal for learning vector representations of sentence meaning.
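To make the curation step concrete, here is a minimal sketch of marker-based pair extraction, assuming spaCy as the dependency parser. The marker list, the dependency-label check, and the clause-splitting heuristic are illustrative assumptions, not the paper's actual rule-based rubrics.

```python
# Minimal sketch of extracting (clause1, marker, clause2) pairs from
# explicit discourse markers. Assumes spaCy's English model is installed
# (python -m spacy download en_core_web_sm). The marker set and the
# splitting rule below are illustrative only.
import spacy

nlp = spacy.load("en_core_web_sm")
MARKERS = {"because", "but", "so", "although", "when", "while"}

def extract_pair(sentence):
    """Split a sentence at an explicit discourse marker, using the
    dependency parse to check the marker links two clauses."""
    doc = nlp(sentence)
    for tok in doc:
        if tok.lower_ in MARKERS and tok.dep_ in ("mark", "cc", "advmod"):
            left = doc[: tok.i].text.strip(" ,")
            right = doc[tok.i + 1 :].text.strip(" ,")
            if left and right:
                return left, tok.lower_, right
    return None

print(extract_pair("Her eyes were red because she had been crying."))
# -> ('Her eyes were red', 'because', 'she had been crying.')
```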
This paper performs additional pre-training on text tailored to discourse classification and tests three methods of adapting BERT to exploit implicit ...
DisSent: in this model (Nie et al., 2019), connectives are used as the supervision signal for learning sentence representations from explicit discourse relations.
Discourse markers annotate deep conceptual relations between sentences. Learning to predict them may thus represent a strong training task for sentence meanings ...
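Concretely, the training task is a classification problem: encode the two clauses and predict which discourse marker originally connected them. The sketch below assumes PyTorch and an InferSent-style BiLSTM encoder with max pooling and [u, v, |u - v|, u * v] features; this is close in spirit to DisSent's setup but is not a faithful reimplementation.

```python
# Minimal sketch of the marker-prediction objective, assuming PyTorch.
# Encoder and feature-combination choices are illustrative.
import torch
import torch.nn as nn

class MarkerClassifier(nn.Module):
    def __init__(self, vocab_size, n_markers, dim=300, hidden=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.LSTM(dim, hidden, batch_first=True,
                               bidirectional=True)
        # Features: [u, v, |u - v|, u * v] over the two clause vectors,
        # each of size 2 * hidden (bidirectional), hence 8 * hidden total.
        self.classifier = nn.Linear(8 * hidden, n_markers)

    def encode(self, ids):
        out, _ = self.encoder(self.embed(ids))
        return out.max(dim=1).values  # max-pool over time steps

    def forward(self, clause1_ids, clause2_ids):
        u, v = self.encode(clause1_ids), self.encode(clause2_ids)
        feats = torch.cat([u, v, (u - v).abs(), u * v], dim=-1)
        return self.classifier(feats)  # logits over discourse markers

# Training reduces to cross-entropy against the removed marker:
model = MarkerClassifier(vocab_size=10000, n_markers=6)
logits = model(torch.randint(0, 10000, (2, 7)),
               torch.randint(0, 10000, (2, 9)))
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 3]))
```

The sentence encoder is the reusable artifact: after training, encode() produces the sentence embeddings evaluated on downstream tasks.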
DiscSent (Jernite et al., 2017) and DisSent (Nie et al., 2017) both utilise annotated explicit discourse relations, which also provide a good signal for learning sentence ...
Pre-trained Sentence Embeddings for Implicit Discourse Relation Classification
Pretraining with Contrastive Sentence Objectives Improves Discourse Performance ...
DisSent: Learning Sentence Representations from Explicit Discourse Relations. ACL 2019. Allen Nie, Erin Bennett, Noah Goodman.
Learning to Explain: Answering Why-Questions via Rephrasing
DisSent: Learning Sentence Representations from Explicit Discourse Relations.