Abstract
This work explores normalization for parser adaptation. Traditionally, normalization is used as a separate pre-processing step. We show that integrating the normalization model into the parsing algorithm is beneficial. This way, multiple normalization candidates can be leveraged, which improves parsing performance on social media. We test this hypothesis by modifying the Berkeley parser; out-of-the-box it achieves an F1 score of 66.52. Our integrated approach reaches a significant improvement with an F1 score of 67.36, while using the best normalization sequence results in an F1 score of only 66.94.
| Original language | English |
|---|---|
| Pages | 491--497 |
| Number of pages | 7 |
| Publication status | Published - Jul-2017 |
| Event | Association for Computational Linguistics (ACL 2017), Vancouver, Canada. Duration: 5-Aug-2017 → 12-Aug-2017 |
Conference

| Conference | Association for Computational Linguistics (ACL 2017) |
|---|---|
| Period | 05/08/2017 → 12/08/2017 |