Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages

Wietse de Vries*, Martijn Wieling, Malvina Nissim

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

35 Citations (Scopus)
166 Downloads (Pure)

Abstract

Cross-lingual transfer learning with large multilingual pre-trained models can be an effective approach for low-resource languages with no labeled training data. Existing evaluations of the zero-shot cross-lingual generalisability of large pre-trained models use datasets with English training data and test data in a selection of target languages. We explore a more extensive transfer learning setup with 65 different source languages and 105 target languages for part-of-speech tagging. Through our analysis, we show that pre-training on both the source and target language, as well as matching language families, writing systems, word order systems, and lexical-phonetic distance, significantly impacts cross-lingual performance. The findings described in this paper can be used as indicators of which factors are important for effective zero-shot cross-lingual transfer to zero- and low-resource languages.
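To make the setup concrete, below is a minimal sketch of zero-shot cross-lingual POS transfer as the abstract describes it: fine-tune a multilingual pre-trained encoder on source-language data, then evaluate on a target language whose labels it has never seen. It assumes the Hugging Face transformers and datasets libraries; the model (xlm-roberta-base) and the treebank pair (en_ewt as source, fi_tdt as target) are illustrative choices, not necessarily the paper's exact configuration.

```python
# Hypothetical sketch: fine-tune a multilingual encoder for POS tagging on a
# source language, evaluate zero-shot on a target language. Model and
# treebanks are illustrative assumptions, not the paper's exact setup.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

MODEL = "xlm-roberta-base"  # assumption: any multilingual encoder with a fast tokenizer

# Universal Dependencies treebanks: English source, Finnish target (illustrative).
# Depending on the datasets version, this script-based dataset may need
# trust_remote_code=True.
source = load_dataset("universal_dependencies", "en_ewt", trust_remote_code=True)
target = load_dataset("universal_dependencies", "fi_tdt", trust_remote_code=True)

tokenizer = AutoTokenizer.from_pretrained(MODEL)
upos = source["train"].features["upos"].feature.names  # UPOS tagset, shared across UD
model = AutoModelForTokenClassification.from_pretrained(MODEL, num_labels=len(upos))

def encode(batch):
    # Tokenize pre-split words; label the first subword of each word and mask
    # continuation subwords and special tokens with -100 (ignored by the loss).
    enc = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    labels = []
    for i, tags in enumerate(batch["upos"]):
        prev, row = None, []
        for wid in enc.word_ids(batch_index=i):
            row.append(-100 if wid is None or wid == prev else tags[wid])
            prev = wid
        labels.append(row)
    enc["labels"] = labels
    return enc

source_enc = source.map(encode, batched=True, remove_columns=source["train"].column_names)
target_enc = target.map(encode, batched=True, remove_columns=target["test"].column_names)

def accuracy(eval_pred):
    logits, gold = eval_pred
    pred = np.argmax(logits, axis=-1)
    mask = gold != -100  # score only real word positions
    return {"accuracy": float((pred[mask] == gold[mask]).mean())}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="pos-transfer", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=source_enc["train"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
    compute_metrics=accuracy,
)
trainer.train()

# Zero-shot evaluation: no labeled Finnish data was seen during fine-tuning.
print(trainer.evaluate(target_enc["test"]))
```

Because all Universal Dependencies treebanks share the same UPOS tagset, the label indices learned on the source language carry over unchanged to the target language, which is what makes the zero-shot evaluation well-defined.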
Original language: English
Title of host publication: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Publisher: Association for Computational Linguistics (ACL)
Pages: 7676-7685
Number of pages: 10
Publication status: Published - 2022
Event: The 60th Annual Meeting of the Association for Computational Linguistics - Dublin, Ireland
Duration: 22-May-2022 - 27-May-2022

Conference

Conference: The 60th Annual Meeting of the Association for Computational Linguistics
Country/Territory: Ireland
City: Dublin
Period: 22/05/2022 - 27/05/2022
