TY - UNPB
T1 - Mapping Transformer Leveraged Embeddings for Cross-Lingual Document Representation
AU - Tashu, Tsegaye Misikir
AU - Kontos, Eduard-Raul
AU - Sabatelli, Matthia
AU - Valdenegro-Toro, Matias
PY - 2024/1/12
Y1 - 2024/1/12
N2 - Recommendation systems for documents have become essential tools for finding relevant content on the Web. However, these systems have limitations when it comes to recommending documents in languages different from the query language, which means they might overlook resources in non-native languages. This research focuses on representing documents across languages by using Transformer Leveraged Document Representations (TLDRs) that are mapped to a cross-lingual domain. Four multilingual pre-trained transformer models (mBERT, mT5, XLM-RoBERTa, ErnieM) were evaluated using three mapping methods across 20 language pairs representing combinations of five selected languages of the European Union. Metrics such as Mate Retrieval Rate and Reciprocal Rank were used to measure the effectiveness of mapped TLDRs compared to non-mapped ones. The results highlight the power of cross-lingual representations achieved through pre-trained transformers and mapping approaches, suggesting a promising direction for expanding beyond connections between two specific languages.
AB - Recommendation systems for documents have become essential tools for finding relevant content on the Web. However, these systems have limitations when it comes to recommending documents in languages different from the query language, which means they might overlook resources in non-native languages. This research focuses on representing documents across languages by using Transformer Leveraged Document Representations (TLDRs) that are mapped to a cross-lingual domain. Four multilingual pre-trained transformer models (mBERT, mT5, XLM-RoBERTa, ErnieM) were evaluated using three mapping methods across 20 language pairs representing combinations of five selected languages of the European Union. Metrics such as Mate Retrieval Rate and Reciprocal Rank were used to measure the effectiveness of mapped TLDRs compared to non-mapped ones. The results highlight the power of cross-lingual representations achieved through pre-trained transformers and mapping approaches, suggesting a promising direction for expanding beyond connections between two specific languages.
KW - cs.CL
KW - cs.AI
KW - cs.IR
KW - cs.LG
U2 - 10.48550/arXiv.2401.06583
DO - 10.48550/arXiv.2401.06583
M3 - Preprint
BT - Mapping Transformer Leveraged Embeddings for Cross-Lingual Document Representation
PB - arXiv
ER -