Unsupervised Translation of German–Lower Sorbian: Exploring Training and Novel Transfer Methods on a Low-Resource Language

Lukas Edman, Ahmet Üstün, Antonio Toral Ruiz, Gertjan van Noord

    Research output: Academic, peer reviewed

    7 Citations (Scopus)
    63 Downloads (Pure)

    Abstract

    This paper describes the methods behind the systems submitted by the University of Groningen for the WMT 2021 Unsupervised Machine Translation task for German–Lower Sorbian (DE–DSB): a high-resource language to a low-resource one. Our system uses a transformer encoder-decoder architecture in which we make three changes to the standard training procedure. First, our training focuses on two languages at a time, contrasting with a wealth of research on multilingual systems. Second, we introduce a novel method for initializing the vocabulary of an unseen language, achieving improvements of 3.2 BLEU for DE->DSB and 4.0 BLEU for DSB->DE. Lastly, we experiment with the order in which offline and online back-translation are used to train an unsupervised system, finding that using online back-translation first works better for DE->DSB by 2.76 BLEU. Our submissions ranked first (tied with another team) for DSB->DE and third for DE->DSB.
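
    The abstract mentions a novel method for initializing the vocabulary of an unseen language, but does not spell out how that initialization works. Purely as a hedged illustration of the general idea of vocabulary transfer, the sketch below shows one common generic approach: copying pretrained (e.g. DE) subword embeddings for tokens that also occur in the new (e.g. DSB) vocabulary and randomly initializing the rest. The function name init_target_embeddings, the lexical-overlap heuristic, and all toy values are assumptions for illustration only, not the paper's actual method.

```python
# Hedged sketch (assumption, not the paper's method): initialize embeddings
# for an unseen language's vocabulary by reusing pretrained embeddings of
# overlapping subwords and randomly initializing the remainder.
import numpy as np

def init_target_embeddings(src_vocab, src_emb, tgt_vocab, seed=0):
    """src_vocab: dict mapping token -> row index in src_emb (pretrained, e.g. DE).
    tgt_vocab: list of tokens for the unseen language (e.g. DSB).
    Returns an embedding matrix whose rows are aligned with tgt_vocab."""
    rng = np.random.default_rng(seed)
    dim = src_emb.shape[1]
    # Start from small random vectors for every target token.
    tgt_emb = rng.normal(0.0, 0.02, size=(len(tgt_vocab), dim))
    shared = 0
    for i, tok in enumerate(tgt_vocab):
        if tok in src_vocab:  # subword already seen by the pretrained model
            tgt_emb[i] = src_emb[src_vocab[tok]]
            shared += 1
    print(f"copied {shared}/{len(tgt_vocab)} embeddings from the source vocabulary")
    return tgt_emb

# Toy usage with random "pretrained" embeddings.
src_vocab = {"der": 0, "die": 1, "haus": 2, "wasser": 3}
src_emb = np.random.default_rng(1).normal(size=(len(src_vocab), 8))
tgt_emb = init_target_embeddings(src_vocab, src_emb, ["ta", "haus", "woda", "die"])
```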
    Original language: English
    Title: Proceedings of the Sixth Conference on Machine Translation
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 982-988
    Number of pages: 7
    Status: Published - 2021
    Event: Sixth Conference on Machine Translation - Online
    Duration: 10 Nov 2021 - 11 Nov 2021

