FPT: Feature Prompt Tuning for Few-shot Readability Assessment is a technique that helps language models assess the readability of text, even when there is only a small amount of labeled training data available. Readability assessment is the task of determining how easy or difficult a piece of text is to understand.
Apr 3, 2024: We propose a novel prompt-based tuning framework that incorporates rich linguistic knowledge, called Feature Prompt Tuning (FPT).
Hard prompt tuning applies a template containing a [MASK] token to the original input and maps the predicted label word to the corresponding class.
Figure 1: Comparison of previous prompt tuning frameworks and our proposed Feature Prompt Tuning (FPT). T(·) and verbalizer(·) denote the template and verbalizer functions, respectively.
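The hard-prompt mechanism described above (a cloze template with a [MASK] slot plus a verbalizer) can be sketched in a few lines. This is an illustrative toy, not the paper's released code: the template wording, the `VERBALIZER` mapping, and the `classify` helper are all assumptions, and a dictionary of scores stands in for the PLM's masked-LM probabilities over the label words.

```python
# Toy sketch of hard prompt tuning for readability assessment.
# All names here are illustrative, not from the paper's implementation.

def apply_template(text: str) -> str:
    """T(.): wrap the input in a cloze-style template with a [MASK] slot."""
    return f"{text} This text is [MASK] to read."

# verbalizer(.): maps each label word the model may predict at [MASK]
# to a readability class.
VERBALIZER = {"easy": "low-difficulty", "hard": "high-difficulty"}

def classify(mask_word_scores: dict) -> str:
    """Pick the class whose label word scores highest at the [MASK] position.

    `mask_word_scores` stands in for the PLM's masked-LM probabilities
    restricted to the verbalizer's label words.
    """
    best_word = max(VERBALIZER, key=lambda w: mask_word_scores.get(w, 0.0))
    return VERBALIZER[best_word]

prompted = apply_template("The cat sat on the mat.")
label = classify({"easy": 0.9, "hard": 0.1})
```

In a real pipeline, `mask_word_scores` would come from a masked language model scoring `prompted` at the [MASK] position; the verbalizer then converts the winning label word into the final class.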
Code for the NAACL 2024 main conference paper "FPT: Feature Prompt Tuning for Few-shot Readability Assessment" is available in the authors' repository.
Prompt-based fine-tuning has boosted the performance of Pre-trained Language Models (PLMs) on few-shot text classification by employing task-specific prompts.
To address these issues, we propose a novel prompt-based tuning framework that incorporates rich linguistic knowledge, called Feature Prompt Tuning (FPT). Specifically, we extract linguistic features from the text and embed them into trainable prompts.
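The two steps above (extract linguistic features, then embed them into trainable prompts) can be sketched as follows. This is a minimal illustration under assumptions: the two toy features (words per sentence, mean word length) and the linear `FeaturePrompt` projection are stand-ins, not the feature set or architecture from the paper, and plain Python lists replace the trainable tensors a real implementation would use.

```python
# Illustrative sketch of the feature-prompt idea: extract linguistic
# features from text and project them into prompt vectors.
# Feature choices and projection are assumptions for illustration only.
import random

def linguistic_features(text: str) -> list:
    """Two toy readability features: words per sentence, mean word length."""
    sentences = [s for s in text.split(".") if s.strip()]
    words = text.split()
    words_per_sentence = len(words) / max(len(sentences), 1)
    mean_word_len = sum(len(w.strip(".,")) for w in words) / max(len(words), 1)
    return [words_per_sentence, mean_word_len]

class FeaturePrompt:
    """Linear projection from feature values to a soft-prompt vector.

    In a real framework these weights would be trainable parameters
    updated by backpropagation; random lists stand in here.
    """
    def __init__(self, n_features: int, prompt_dim: int, seed: int = 0):
        rng = random.Random(seed)
        self.weights = [[rng.uniform(-0.1, 0.1) for _ in range(n_features)]
                        for _ in range(prompt_dim)]

    def __call__(self, feats: list) -> list:
        # One prompt dimension per weight row: dot product with the features.
        return [sum(w * f for w, f in zip(row, feats)) for row in self.weights]

feats = linguistic_features("The cat sat. The dog ran fast.")
prompt = FeaturePrompt(n_features=2, prompt_dim=4)(feats)
```

The resulting `prompt` vector would be prepended to the model's input embeddings alongside the templated text, so the PLM conditions on explicit readability signals rather than raw text alone.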
Feature Prompt Tuning: Enhancing Readability Assessment with Linguistic Knowledge. A novel prompt-based tuning framework, Feature Prompt Tuning (FPT), incorporates linguistic features into the prompts used for few-shot readability assessment.
Updated Apr 14, 2024. FPT: Feature Prompt Tuning for Few-shot Readability Assessment. Authors: Ziyang Wang, Sanwoo Lee, Hsiu-Yuan Huang, et al.
In this paper, we employ prompts to improve deep feature representations, inspired by the great success of prompt learning (Lee and Lee, 2023; Liu et al.).