Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer

Research output: Academic, peer reviewed


Abstract

We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold-aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach for multilingual formality transfer, which consists of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.
Original language: English
Title: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics
Publisher: Association for Computational Linguistics, ACL Anthology
Status: E-pub ahead of print - 2022
Event: The 60th Annual Meeting of the Association for Computational Linguistics - Dublin, Ireland
Duration: 22 May 2022 to 27 May 2022

Conference

Conference: The 60th Annual Meeting of the Association for Computational Linguistics
Country/Region: Ireland
City: Dublin
Period: 22/05/2022 to 27/05/2022
