
Selective Reflection-Tuning: Student-Selected Data Recycling for LLM Instruction-Tuning

Ming Li, Lichang Chen, Jiuhai Chen, Shwai He, Jiuxiang Gu, Tianyi Zhou


Abstract
Instruction tuning is critical for large language models (LLMs) to achieve better instruction-following and task adaptation capabilities, but its success heavily relies on the quality of the training data. Many recent methods focus on improving data quality but often overlook the compatibility of the data with the student model being finetuned. This paper introduces Selective Reflection-Tuning, a novel paradigm that synergizes a teacher LLM’s reflection and introspection for improving existing data quality with the data selection capability of the student LLM, to automatically refine existing instruction-tuning data. This teacher-student collaboration produces high-quality and student-compatible instruction-response pairs, resulting in sample-efficient instruction tuning and LLMs of superior performance. Selective Reflection-Tuning is a data augmentation and synthesis method that generally improves LLM finetuning and self-improvement without collecting brand-new data. We apply our method to Alpaca and WizardLM data and achieve much stronger and top-tier 7B and 13B LLMs.
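
The abstract describes a two-party pipeline: a teacher LLM reflects on each existing instruction-response pair to propose an improved version, and the student LLM that will be finetuned decides whether to adopt the rewrite based on its own scoring of the sample. The minimal Python sketch below illustrates that control flow only; teacher_reflect and student_score are hypothetical stand-ins (not the authors' released code or their selection metric), included simply to make the data-recycling loop concrete.

# Illustrative sketch of the teacher-student data recycling described above.
# teacher_reflect and student_score are hypothetical callables: the first
# represents the teacher LLM rewriting a pair, the second represents the
# student LLM scoring how useful/compatible a pair is for its own training.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Sample:
    instruction: str
    response: str

def selective_reflection_tuning(
    data: List[Sample],
    teacher_reflect: Callable[[Sample], Sample],
    student_score: Callable[[Sample], float],
) -> List[Sample]:
    """Recycle an existing instruction-tuning set into a refined one."""
    refined: List[Sample] = []
    for original in data:
        reflected = teacher_reflect(original)  # teacher improves the pair
        # The student keeps the reflected pair only if its own score improves,
        # so the final data is both higher quality and student-compatible.
        if student_score(reflected) > student_score(original):
            refined.append(reflected)
        else:
            refined.append(original)
    return refined

# Toy usage with stub "models", just to show the control flow.
if __name__ == "__main__":
    data = [Sample("Name a prime number.", "4 is prime.")]
    fix = lambda s: Sample(s.instruction, "2 is a prime number.")
    score = lambda s: float("prime number" in s.response)
    for s in selective_reflection_tuning(data, fix, score):
        print(s.instruction, "->", s.response)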
Anthology ID: 2024.findings-acl.958
Volume: Findings of the Association for Computational Linguistics: ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 16189–16211
URL: https://aclanthology.org/2024.findings-acl.958
DOI: 10.18653/v1/2024.findings-acl.958
Cite (ACL): Ming Li, Lichang Chen, Jiuhai Chen, Shwai He, Jiuxiang Gu, and Tianyi Zhou. 2024. Selective Reflection-Tuning: Student-Selected Data Recycling for LLM Instruction-Tuning. In Findings of the Association for Computational Linguistics: ACL 2024, pages 16189–16211, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): Selective Reflection-Tuning: Student-Selected Data Recycling for LLM Instruction-Tuning (Li et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.958.pdf