Generating logical forms from graph representations of text and entities
arXiv preprint arXiv:1905.08407, 2019
Structured information about entities is critical for many semantic parsing tasks. We present an approach that uses a Graph Neural Network (GNN) architecture to incorporate information about relevant entities and their relations during parsing. Combined with a decoder copy mechanism, this approach provides a conceptually simple mechanism to generate logical forms with entities. We demonstrate that this approach is competitive with the state-of-the-art across several tasks without pre-training, and outperforms existing approaches when combined with BERT pre-training.
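To make the described architecture concrete, below is a minimal, illustrative sketch (not the paper's implementation) of the two ingredients the abstract names: a message-passing layer over an entity graph, and a single decoder step that mixes generating a logical-form token from the output vocabulary with copying an entity node (a pointer/copy mechanism). All class names, dimensions, and the toy data are assumptions for illustration only.

```python
# Hypothetical sketch of (1) a GNN layer over an entity graph and
# (2) a decoder step with a copy mechanism over entity nodes.
# None of these names come from the paper; they are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntityGraphLayer(nn.Module):
    """One round of neighbor aggregation over the entity graph."""

    def __init__(self, dim):
        super().__init__()
        self.message = nn.Linear(dim, dim)
        self.update = nn.GRUCell(dim, dim)

    def forward(self, node_states, adjacency):
        # adjacency: (num_nodes, num_nodes) 0/1 matrix encoding entity relations.
        messages = adjacency @ self.message(node_states)   # sum messages from neighbors
        return self.update(messages, node_states)          # gated update of node states


class CopyDecoderStep(nn.Module):
    """One decoding step: generate a vocabulary token or copy an entity node."""

    def __init__(self, dim, vocab_size):
        super().__init__()
        self.vocab_proj = nn.Linear(dim, vocab_size)
        self.copy_gate = nn.Linear(dim, 1)

    def forward(self, decoder_state, node_states):
        gen_logits = self.vocab_proj(decoder_state)            # scores over output vocabulary
        copy_logits = node_states @ decoder_state              # attention scores over entity nodes
        p_copy = torch.sigmoid(self.copy_gate(decoder_state))  # scalar copy-vs-generate gate
        gen_probs = (1 - p_copy) * F.softmax(gen_logits, dim=-1)
        copy_probs = p_copy * F.softmax(copy_logits, dim=-1)
        # Final distribution over [vocabulary tokens ; entity nodes].
        return torch.cat([gen_probs, copy_probs], dim=-1)


if __name__ == "__main__":
    dim, vocab_size, num_nodes = 16, 50, 4
    nodes = torch.randn(num_nodes, dim)          # entity node embeddings
    adj = torch.eye(num_nodes)                   # toy relation structure
    nodes = EntityGraphLayer(dim)(nodes, adj)    # enrich entities with relation context
    step = CopyDecoderStep(dim, vocab_size)
    dist = step(torch.randn(dim), nodes)
    print(dist.shape)                            # torch.Size([54]): 50 vocab + 4 copyable entities
```

The key design point the sketch tries to reflect is that the same entity representations produced by the graph encoder are reused by the decoder as copy targets, so entities mentioned in the input can be emitted directly into the logical form rather than predicted from a fixed vocabulary.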