%0 Conference Proceedings
%T Towards Less Generic Responses in Neural Conversation Models: A Statistical Re-weighting Method
%A Liu, Yahui
%A Bi, Wei
%A Gao, Jun
%A Liu, Xiaojiang
%A Yao, Jian
%A Shi, Shuming
%Y Riloff, Ellen
%Y Chiang, David
%Y Hockenmaier, Julia
%Y Tsujii, Jun'ichi
%S Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
%D 2018
%8 oct nov
%I Association for Computational Linguistics
%C Brussels, Belgium
%F liu-etal-2018-towards-less
%X Sequence-to-sequence neural generation models have achieved promising performance on short text conversation tasks. However, they tend to generate generic/dull responses, leading to an unsatisfying dialogue experience. We observe that in conversation tasks, each query can have multiple responses, forming a 1-to-n or m-to-n relationship across the corpus as a whole. The objective function used in standard sequence-to-sequence models is then dominated by loss terms from responses with generic patterns. Motivated by this observation, we introduce a statistical re-weighting method that assigns different weights to the multiple responses of the same query and trains a common neural generation model with these weights. Experimental results on a large Chinese dialogue corpus show that our method improves the acceptance rate of generated responses over several baseline models and significantly reduces the number of generic responses generated.
%R 10.18653/v1/D18-1297
%U https://aclanthology.org/D18-1297
%U https://doi.org/10.18653/v1/D18-1297
%P 2769-2774