%0 Conference Proceedings
%T Point Precisely: Towards Ensuring the Precision of Data in Generated Texts Using Delayed Copy Mechanism
%A Li, Liunian
%A Wan, Xiaojun
%Y Bender, Emily M.
%Y Derczynski, Leon
%Y Isabelle, Pierre
%S Proceedings of the 27th International Conference on Computational Linguistics
%D 2018
%8 August
%I Association for Computational Linguistics
%C Santa Fe, New Mexico, USA
%F li-wan-2018-point
%X The task of data-to-text generation aims to generate descriptive texts conditioned on a number of database records, and recent neural models have shown significant progress on this task. The attention based encoder-decoder models with copy mechanism have achieved state-of-the-art results on a few data-to-text datasets. However, such models still face the problem of putting incorrect data records in the generated texts, especially on some more challenging datasets like RotoWire. In this paper, we propose a two-stage approach with a delayed copy mechanism to improve the precision of data records in the generated texts. Our approach first adopts an encoder-decoder model to generate a template text with data slots to be filled and then leverages a proposed delayed copy mechanism to fill in the slots with proper data records. Our delayed copy mechanism can take into account all the information of the input data records and the full generated template text by using double attention, position-aware attention and a pairwise ranking loss. The two models in the two stages are trained separately. Evaluation results on the RotoWire dataset verify the efficacy of our proposed approach to generate better templates and copy data records more precisely.
%U https://aclanthology.org/C18-1089
%P 1044-1055