
Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars

Ryo Yoshida, Hiroshi Noji, Yohei Oseki


Abstract
In computational linguistics, it has been shown that hierarchical structures make language models (LMs) more human-like. However, the previous literature has been agnostic about the parsing strategy of such hierarchical models. In this paper, we investigated whether hierarchical structures make LMs more human-like, and if so, which parsing strategy is most cognitively plausible. To address this question, we evaluated three LMs against human reading times in Japanese, a head-final, left-branching language: Long Short-Term Memory (LSTM) as a sequential model, and Recurrent Neural Network Grammars (RNNGs) with top-down and left-corner parsing strategies as hierarchical models. Our computational modeling demonstrated that left-corner RNNGs outperformed both top-down RNNGs and the LSTM, suggesting that hierarchical, left-corner architectures are more cognitively plausible than top-down or sequential architectures. In addition, we discuss the relationships between cognitive plausibility and (i) perplexity, (ii) parsing, and (iii) beam size.
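The evaluation paradigm the abstract describes is surprisal-based: a word's processing difficulty is estimated as its negative log probability under an LM, and the resulting surprisal values are regressed against human reading times so that model fit can be compared across architectures. The Python sketch below illustrates that pipeline in miniature. It is not the authors' pipeline (see the linked code repository for that); the probability and reading-time arrays are hypothetical placeholder data, and the simple OLS regression via statsmodels stands in for the paper's actual statistical modeling.

# Minimal sketch of surprisal-based evaluation against reading times.
# Not the authors' pipeline; placeholder data and a plain OLS regression
# are illustrative assumptions (see osekilab/rnng-eyetrack for the real code).
import numpy as np
import statsmodels.api as sm

# Hypothetical per-token conditional probabilities p(w_t | w_<t) from some LM
# (e.g., an LSTM, or an RNNG decoded with word-synchronous beam search),
# paired with hypothetical human reading times in milliseconds.
token_probs = np.array([0.12, 0.05, 0.30, 0.08, 0.21])
reading_times = np.array([310.0, 420.0, 250.0, 390.0, 275.0])

# Surprisal in bits: -log2 p(w_t | w_<t).
surprisal = -np.log2(token_probs)

# Regress reading times on surprisal; the slope estimates the per-bit
# processing cost, and goodness of fit (e.g., log-likelihood) can be
# compared across LMs to assess their cognitive plausibility.
X = sm.add_constant(surprisal)
fit = sm.OLS(reading_times, X).fit()
print(fit.summary())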
Anthology ID:
2021.emnlp-main.235
Original:
2021.emnlp-main.235v1
Version 2:
2021.emnlp-main.235v2
Version 3:
2021.emnlp-main.235v3
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2964–2973
URL:
https://aclanthology.org/2021.emnlp-main.235
DOI:
10.18653/v1/2021.emnlp-main.235
Cite (ACL):
Ryo Yoshida, Hiroshi Noji, and Yohei Oseki. 2021. Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2964–2973, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars (Yoshida et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.235.pdf
Video:
https://aclanthology.org/2021.emnlp-main.235.mp4
Code:
osekilab/rnng-eyetrack (+ additional community code)