These are not the Stereotypes You are Looking For: Bias and Fairness in Authorial Gender Attribution

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    34 Citations (Scopus)
    265 Downloads (Pure)

    Abstract

    Stylometric and text categorization results show that author gender can be discerned in texts with relatively high accuracy. However, it is difficult to explain what gives rise to these results, and there are many possible confounding factors, such as the domain, genre, and target audience of a text. More fundamentally, such classification efforts risk invoking stereotyping and essentialism. We explore this issue in two datasets of Dutch literary novels, using widely used descriptive (LIWC, topic modeling) and predictive (machine learning) methods. Our results show the importance of controlling for variables in the corpus, and we argue for taking care not to overgeneralize from the results.
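
    The abstract refers to predictive (machine learning) gender classification and to controlling for corpus variables such as genre. The following minimal sketch is not the paper's actual pipeline or data; it only illustrates, under those assumptions, how a bag-of-words classifier can be evaluated with genre-grouped cross-validation so that the confounding variable cannot leak into the accuracy estimate. The toy corpus and all names are hypothetical.

# Illustrative sketch only; not the authors' method or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical toy corpus: (text, author gender label, genre).
records = [
    ("a short literary excerpt about family and memory", "F", "literary"),
    ("another literary excerpt about a marriage unravelling", "M", "literary"),
    ("a suspense excerpt with a detective and a missing witness", "F", "suspense"),
    ("another suspense excerpt with a chase through the harbour", "M", "suspense"),
] * 5  # repeated so cross-validation has enough samples

texts = [t for t, _, _ in records]
labels = [y for _, y, _ in records]
genres = [g for _, _, g in records]

# Bag-of-words features plus a linear classifier: a common predictive setup.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)

# Grouping folds by genre keeps train and test genres disjoint, so the
# classifier cannot exploit genre as a proxy for the gender label.
scores = cross_val_score(clf, texts, labels,
                         cv=GroupKFold(n_splits=2), groups=genres)
print("accuracy per genre-held-out fold:", scores)

    Comparing these genre-held-out scores with scores from an ordinary random split is one simple way to see how much of the apparent gender signal is carried by a confounding variable.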
    Original language: English
    Title of host publication: Proceedings of the First Ethics in NLP workshop
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 12-22
    Number of pages: 11
    DOIs
    Publication status: Published - 2017
