
Team NTR @ AutoMin 2023: Dolly LLM Improves Minuting Performance, Semantic Segmentation Doesn’t

Eugene Borisov, Nikolay Mikhaylovskiy


Abstract
This paper documents the approach of Team NTR for the Second Shared Task on Automatic Minuting (AutoMin) at INLG 2023. The goal of this work is to develop a module for the automatic generation of meeting minutes from a meeting transcript produced by an Automatic Speech Recognition (ASR) system (Task A). We treat minuting as a supervised machine learning task on pairs of texts: the transcript of a meeting and its minutes. We use a two-stage minuting pipeline that consists of segmentation and summarization. We experiment with semantic segmentation, multi-language approaches, and the Large Language Model Dolly. With the submitted Naive Segmentation + Dolly7b pipeline, we achieve a Rouge1-F of 0.2455 and a BERT-Score of 0.8063 on the English part of the ELITR test set, and a Rouge1-F of 0.2430 and a BERT-Score of 0.8332 on the EuroParl dev set.
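To make the two-stage pipeline described above concrete, the following is a minimal sketch of naive segmentation followed by Dolly-based summarization. It assumes the publicly available databricks/dolly-v2-7b checkpoint on Hugging Face, a fixed chunk size of 40 transcript lines, and an illustrative prompt; these are assumptions for illustration only and are not the authors' exact configuration or prompt.

from transformers import pipeline

# Load Dolly 7B as a standard text-generation pipeline (GPT-NeoX architecture,
# so no custom pipeline code is required). Checkpoint name is an assumption.
generator = pipeline(
    "text-generation",
    model="databricks/dolly-v2-7b",
    device_map="auto",
)

def naive_segments(transcript_lines, chunk_size=40):
    """Naive segmentation: split the transcript into fixed-size blocks of lines."""
    for start in range(0, len(transcript_lines), chunk_size):
        yield "\n".join(transcript_lines[start:start + chunk_size])

def summarize_segment(segment):
    """Ask Dolly to turn one transcript segment into bullet-point minutes."""
    # Prompt wording is illustrative, not the prompt used in the paper.
    prompt = (
        "Summarize the following meeting transcript segment as short minutes "
        "in bullet points:\n\n" + segment + "\n\nMinutes:"
    )
    output = generator(
        prompt, max_new_tokens=200, do_sample=False, return_full_text=False
    )
    return output[0]["generated_text"].strip()

def minute(transcript_lines):
    """Full pipeline: segment the transcript, summarize each segment, concatenate."""
    return "\n\n".join(summarize_segment(seg) for seg in naive_segments(transcript_lines))

Under these assumptions, calling minute(open("transcript.txt").read().splitlines()) produces one block of minutes per transcript segment, concatenated into a single document.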
Anthology ID:
2023.inlg-genchal.18
Volume:
Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges
Month:
September
Year:
2023
Address:
Prague, Czechia
Editor:
Simon Mille
Venues:
INLG | SIGDIAL
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
132–137
URL:
https://aclanthology.org/2023.inlg-genchal.18
Cite (ACL):
Eugene Borisov and Nikolay Mikhaylovskiy. 2023. Team NTR @ AutoMin 2023: Dolly LLM Improves Minuting Performance, Semantic Segmentation Doesn’t. In Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges, pages 132–137, Prague, Czechia. Association for Computational Linguistics.
Cite (Informal):
Team NTR @ AutoMin 2023: Dolly LLM Improves Minuting Performance, Semantic Segmentation Doesn’t (Borisov & Mikhaylovskiy, INLG-SIGDIAL 2023)
PDF:
https://aclanthology.org/2023.inlg-genchal.18.pdf