PROTEST-ER: Retraining BERT for Protest Event Extraction

Tommaso Caselli, Osman Mutlu, Angelo Basile, Ali Hürriyetoğlu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

7 Citations (Scopus)
84 Downloads (Pure)

Abstract

We analyze the effect of further retraining BERT with different domain-specific data as an unsupervised domain adaptation strategy for event extraction. Portability of event extraction models is particularly challenging, with large performance drops affecting data on the same text genres (e.g., news). We present PROTEST-ER, a retrained BERT model for protest event extraction. PROTEST-ER outperforms a corresponding generic BERT on out-of-domain data by 8.1 points. Our best performing models reach 51.91 and 46.39 F1 across the two domains.
Original language: English
Title of host publication: Proceedings of the 4th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021)
Editors: Ali Hürriyetoğlu
Publisher: Association for Computational Linguistics (ACL)
Pages: 12-19
Number of pages: 8
DOIs
Publication status: Published - 2021
