Italian Transformers Under the Linguistic Lens

Alessio Miaschi*, Gabriele Sarti, Dominique Brunato, Felice Dell'Orletta, Giulia Venturi

*Corresponding author for this work

Research output: Academic, peer reviewed

Abstract

In this paper we present an in-depth investigation of the linguistic knowledge encoded by the transformer models currently available for the Italian language. In particular, we investigate whether and how using different architectures of probing models affects the performance of Italian transformers in encoding a wide spectrum of linguistic features. Moreover, we explore how this implicit knowledge varies according to different textual genres.
Original language: English
Title: Proceedings of the Seventh Italian Conference on Computational Linguistics (CLiC-it 2020)
Editors: Johanna Monti, Felice Dell'Orletta, Fabio Tamburini
Publisher: CEUR Workshop Proceedings (CEUR-WS.org)
Status: Published - 1 Mar 2021
Externally published: Yes
Event: Italian Conference on Computational Linguistics 2020 - Bologna, Italy
Duration: 1 Mar 2021 - 3 Mar 2021

Conference

Conference: Italian Conference on Computational Linguistics 2020
Abbreviated title: CLiC-it 2020
Country/Territory: Italy
City: Bologna
Period: 01/03/2021 - 03/03/2021
