Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer

Huiyuan Lai, Antonio Toral Ruiz, Malvina Nissim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold-aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach for multilingual formality transfer, which consists of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.
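The following is a minimal sketch (not the authors' released code) of the core setup the abstract describes: fine-tuning mBART as a seq2seq model on parallel informal-to-formal sentence pairs. The checkpoint name, the Italian language code, and the toy sentence pair are illustrative assumptions; real training would iterate over a parallel formality corpus such as machine-translated data.

```python
# Hedged sketch: seq2seq fine-tuning of mBART for formality transfer
# using Hugging Face Transformers. Model name, language codes, and the
# example pair are assumptions for illustration, not the paper's setup.
import torch
from transformers import MBartForConditionalGeneration, MBartTokenizer

model_name = "facebook/mbart-large-cc25"  # assumed pre-trained checkpoint
tokenizer = MBartTokenizer.from_pretrained(
    model_name, src_lang="it_IT", tgt_lang="it_IT"  # same language, different style
)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# One toy informal -> formal pair; a real run would loop over a corpus.
informal = "ciao, come va?"
formal = "Buongiorno, come sta?"

batch = tokenizer(informal, text_target=formal, return_tensors="pt")
loss = model(**batch).loss  # standard seq2seq cross-entropy on the labels
loss.backward()  # an optimizer step would follow in an actual training loop
```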
Original language: English
Title of host publication: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics
Publisher: Association for Computational Linguistics, ACL Anthology
Publication status: E-pub ahead of print - 2022
Event: The 60th Annual Meeting of the Association for Computational Linguistics - Dublin, Ireland
Duration: 22-May-2022 – 27-May-2022

Conference

Conference: The 60th Annual Meeting of the Association for Computational Linguistics
Country/Territory: Ireland
City: Dublin
Period: 22/05/2022 – 27/05/2022
