UDapter: Language Adaptation for Truly Universal Dependency Parsing

    Research output: Contribution to conference › Paper › Academic



    Recent advances in multilingual dependency parsing have brought the idea of a truly universal parser closer to reality. However, cross-language interference and restrained model capacity remain major obstacles. To address this, we propose a novel multilingual task adaptation approach based on contextual parameter generation and adapter modules. This approach makes it possible to learn adapters via language embeddings while sharing model parameters across languages. It also allows for an easy but effective integration of existing linguistic typology features into the parsing network. The resulting parser, UDapter, outperforms strong monolingual and multilingual baselines on the majority of both high-resource and low-resource (zero-shot) languages, showing the success of the proposed adaptation approach. Our in-depth analyses show that soft parameter sharing via typological features is key to this success.
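    The core idea — generating the weights of a bottleneck adapter as a function of a language embedding, so that all languages share the generator while each gets its own adapter — can be sketched as follows. This is a minimal illustration with NumPy and made-up dimensions, not the paper's actual implementation; the tensor names and sizes are assumptions for exposition only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes: hidden dim, adapter bottleneck, language-embedding dim.
    d_model, d_adapter, d_lang = 8, 4, 3

    # Language embedding (in UDapter this can be learned or derived from
    # typological features; here it is random for illustration).
    lang_emb = rng.normal(size=d_lang)

    # Contextual parameter generation: shared generator tensors map the
    # language embedding to this language's adapter weights.
    G_down = rng.normal(size=(d_lang, d_adapter, d_model))
    G_up = rng.normal(size=(d_lang, d_model, d_adapter))

    W_down = np.einsum('l,lam->am', lang_emb, G_down)  # (d_adapter, d_model)
    W_up = np.einsum('l,lma->ma', lang_emb, G_up)      # (d_model, d_adapter)

    def adapter(h):
        # Bottleneck adapter: down-project, ReLU, up-project, residual add.
        return h + W_up @ np.maximum(W_down @ h, 0.0)

    h = rng.normal(size=d_model)   # a hidden state from the shared encoder
    out = adapter(h)
    print(out.shape)  # (8,)
    ```

    Because the generators `G_down`/`G_up` are shared across languages, parameters are soft-shared: similar language embeddings yield similar adapters, which is what lets typological features steer the sharing.
    
    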
    Original language: English
    Number of pages: 14
    Publication status: Published - Nov 2020
    Event: The 2020 Conference on Empirical Methods in Natural Language Processing
    Duration: 16 Nov 2020 → …


    Conference: The 2020 Conference on Empirical Methods in Natural Language Processing
    Abbreviated title: EMNLP 2020
    Period: 16/11/2020 → …
