Abstract
Peeking into the inner workings of BERT has shown that its layers resemble the classical NLP pipeline, with progressively more complex tasks being concentrated in later layers. To investigate to what extent these results also hold for a language other than English, we probe a Dutch BERT-based model and the multilingual BERT model on Dutch NLP tasks. In addition, through a deeper analysis of part-of-speech tagging, we show that, even within a given task, information is spread over different parts of the network and that the pipeline might not be as neat as it seems. Each layer has different specialisations, so it may be more useful to combine information from different layers rather than selecting a single one based on the best overall performance.
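The layer-wise probing described in the abstract can be made concrete with a small sketch: for each BERT layer, extract word-level representations and fit a simple linear classifier for POS tagging; per-layer probe accuracy then indicates where the task's information is concentrated. This is a minimal illustration assuming the Hugging Face transformers and scikit-learn libraries; the multilingual model name is real, but the toy sentence, tag set, and probe setup are placeholders, not the paper's actual experimental code.

```python
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained(
    "bert-base-multilingual-cased", output_hidden_states=True
)
model.eval()

# Toy pre-tokenised Dutch data; placeholders for a real POS-annotated corpus.
sentences = [["De", "kat", "zit", "op", "de", "mat", "."]]
pos_tags = [["DET", "NOUN", "VERB", "ADP", "DET", "NOUN", "PUNCT"]]

def word_vectors(words, layer):
    """Mean-pool the subword vectors of one hidden layer back to words."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states[layer][0]  # (subwords, dim)
    word_ids = enc.word_ids()                          # maps subwords -> words
    return [
        hidden[[j for j, w in enumerate(word_ids) if w == i]].mean(0).numpy()
        for i in range(len(words))
    ]

# One linear probe per layer; higher accuracy suggests more POS information.
for layer in range(1, model.config.num_hidden_layers + 1):
    X = [v for ws in sentences for v in word_vectors(ws, layer)]
    y = [t for ts in pos_tags for t in ts]
    probe = LogisticRegression(max_iter=1000).fit(X, y)
    print(f"layer {layer:2d}: train accuracy {probe.score(X, y):.2f}")
```

In the paper's setting, comparing such per-layer scores is what reveals that the most informative layer differs per task, and that combining information from several layers can outperform any single one.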
| Original language | English |
| --- | --- |
| Title of host publication | Findings of the Association for Computational Linguistics |
| Subtitle of host publication | EMNLP 2020 |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 4339-4350 |
| Number of pages | 12 |
| Publication status | Published - 2020 |
| Event | Conference on Empirical Methods in Natural Language Processing, Online |
| Duration | 7-Nov-2020 → … |
Conference
| Conference | Conference on Empirical Methods in Natural Language Processing |
| --- | --- |
| Abbreviated title | EMNLP |
| Period | 07/11/2020 → … |
Activities
- 1 Academic presentation
- BlackboxNLP 2020 workshop co-located with EMNLP 2020
  Vries, de, W. (Speaker)
  20-Nov-2020, Activity: Talk and presentation › Academic presentation › Academic