
Formality Style Transfer with Shared Latent Space

Yunli Wang, Yu Wu, Lili Mou, Zhoujun Li, WenHan Chao


Abstract
Conventional approaches to formality style transfer borrow models from neural machine translation, which typically require massive parallel data for training. However, datasets for formality style transfer are considerably smaller than translation corpora. Moreover, we observe that informal and formal sentences closely resemble each other, unlike the translation task, where the two languages have different vocabularies and grammars. In this paper, we present a new approach, Sequence-to-Sequence with Shared Latent Space (S2S-SLS), for formality style transfer, in which we propose two auxiliary losses and adopt joint training of bi-directional transfer and auto-encoding. Experimental results show that S2S-SLS (with either RNN or Transformer architectures) consistently outperforms baselines in various settings, especially when we have limited data.
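The abstract describes the training scheme only at a high level. Below is a minimal, hypothetical sketch of what joint training of bi-directional transfer and auto-encoding over a shared latent space could look like; it is not the authors' implementation (their code is linked below). The class and constant names (StyleSeq2Seq, STYLE_INFORMAL, STYLE_FORMAL), the hyper-parameters, and the simple style-embedding conditioning are illustrative assumptions, and the paper's two auxiliary losses are not reproduced here.

```python
# A sketch (NOT the authors' code) of joint training with a shared latent
# space: one encoder/decoder pair is trained on four directions at once --
# informal->formal, formal->informal, and auto-encoding of each style.
# The target style is signalled by a learned style embedding added to the
# encoder's final hidden state before decoding.

import torch
import torch.nn as nn

PAD, VOCAB, EMB, HID = 0, 1000, 64, 128   # assumed toy sizes
STYLE_INFORMAL, STYLE_FORMAL = 0, 1

class StyleSeq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(VOCAB, EMB, padding_idx=PAD)
        self.style_emb = nn.Embedding(2, HID)   # conditions decoding on target style
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src, tgt_in, style):
        _, h = self.encoder(self.tok_emb(src))       # shared latent state
        h = h + self.style_emb(style).unsqueeze(0)   # inject target style
        dec, _ = self.decoder(self.tok_emb(tgt_in), h)
        return self.out(dec)

def joint_step(model, loss_fn, informal, formal):
    """One joint step: bi-directional transfer plus two auto-encoding losses."""
    style_f = torch.full((informal.size(0),), STYLE_FORMAL)
    style_i = torch.full((informal.size(0),), STYLE_INFORMAL)
    directions = [
        (informal, formal, style_f),    # informal -> formal transfer
        (formal, informal, style_i),    # formal -> informal transfer
        (informal, informal, style_i),  # informal auto-encoding
        (formal, formal, style_f),      # formal auto-encoding
    ]
    total = 0.0
    for src, tgt, style in directions:
        logits = model(src, tgt[:, :-1], style)      # teacher forcing
        total = total + loss_fn(logits.reshape(-1, VOCAB), tgt[:, 1:].reshape(-1))
    return total

model = StyleSeq2Seq()
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy random batches stand in for GYAFC sentence pairs; real data would be
# tokenized text with BOS/EOS markers.
informal = torch.randint(2, VOCAB, (4, 10))
formal = torch.randint(2, VOCAB, (4, 10))
loss = joint_step(model, loss_fn, informal, formal)
opt.zero_grad()
loss.backward()
opt.step()
```

The point mirrored here is that all four directions update the same encoder and decoder parameters, so the scarce parallel data is complemented by auto-encoding signal, which is consistent with the paper's claim that joint training helps most when data is limited.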
Anthology ID:
2020.coling-main.203
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2236–2249
URL:
https://aclanthology.org/2020.coling-main.203
DOI:
10.18653/v1/2020.coling-main.203
Cite (ACL):
Yunli Wang, Yu Wu, Lili Mou, Zhoujun Li, and WenHan Chao. 2020. Formality Style Transfer with Shared Latent Space. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2236–2249, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Formality Style Transfer with Shared Latent Space (Wang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.203.pdf
Code:
jimth001/formality_style_transfer_with_shared_latent_space
Data:
GYAFC