CERD: A Comprehensive Chinese Rhetoric Dataset for Rhetorical Understanding and Generation in Essays

N. Liu, X. Chen, H. Wu, C. Sun, M. Lan, Y. Wu, X. Bai, S. Mao, Y. Xia
arXiv preprint arXiv:2409.19691, 2024 - arxiv.org
Existing rhetorical understanding and generation datasets or corpora primarily focus on single coarse-grained categories or fine-grained categories, neglecting the common interrelations between different rhetorical devices by treating them as independent sub-tasks. In this paper, we propose the Chinese Essay Rhetoric Dataset (CERD), consisting of four commonly used coarse-grained categories (metaphor, personification, hyperbole, and parallelism) and 23 fine-grained categories spanning both form and content levels. CERD is a manually annotated, comprehensive Chinese rhetoric dataset with five interrelated sub-tasks. Unlike previous work, our dataset supports understanding various rhetorical devices, recognizing their corresponding rhetorical components, and generating rhetorical sentences under given conditions, thereby improving authors' writing proficiency and language usage skills. Extensive experiments are conducted to demonstrate the interrelations between the multiple tasks in CERD and to establish a benchmark for future research on rhetoric. The experimental results indicate that Large Language Models achieve the best performance across most tasks, and that jointly fine-tuning on multiple tasks further enhances performance.
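The abstract's description of coarse-grained devices, fine-grained sub-types, and component-level annotation can be illustrated with a minimal record schema. This is a hypothetical sketch: the field names, the fine-grained label, and the component keys (`tenor`, `vehicle`) are assumptions for illustration, not the actual CERD format.

```python
from dataclasses import dataclass, field

# The four coarse-grained devices named in the abstract.
COARSE_CATEGORIES = {"metaphor", "personification", "hyperbole", "parallelism"}

@dataclass
class RhetoricAnnotation:
    """Hypothetical sentence-level annotation record (illustrative only)."""
    sentence: str                 # the annotated essay sentence
    coarse_category: str          # one of the 4 coarse-grained devices
    fine_category: str            # one of the 23 fine-grained sub-types (label assumed)
    components: dict = field(default_factory=dict)  # rhetorical components, e.g. spans

    def __post_init__(self):
        if self.coarse_category not in COARSE_CATEGORIES:
            raise ValueError(f"unknown coarse category: {self.coarse_category}")

# Example: a metaphor annotation with its (assumed) component structure.
example = RhetoricAnnotation(
    sentence="The moon was a silver coin in the night sky.",
    coarse_category="metaphor",
    fine_category="noun metaphor",  # illustrative fine-grained label
    components={"tenor": "The moon", "vehicle": "a silver coin"},
)
```

Such a structure makes the interrelation between sub-tasks concrete: device classification reads `coarse_category` and `fine_category`, while component recognition reads `components` from the same record.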