Overcoming algorithm aversion in personnel selection and admissions decisions

    Research output: Thesis › Thesis fully internal (DIV)



    Algorithms are good!

    Due in part to the Dutch childcare benefits scandal, algorithms have increasingly come under fire. However, the algorithms themselves are not to blame; the problem lies in what people feed into them. In general, using algorithms is a good idea. Unlike humans, they judge everything and everyone in the same, predictable way. This leads to better results, even for decisions about unique individuals where many seemingly complex factors are relevant, such as selecting the best candidates for a job or study programme. A simple algorithm produces better predictions of future performance than a holistic expert judgement. This has been known for a long time, and yet algorithms are rarely used in selection practice: people have algorithm aversion.

    This thesis therefore explores what explains algorithm aversion and what makes people more positive about algorithms. A literature review yielded only a few practically useful factors. I showed that training in algorithmic, evidence-based decision-making may be useful but is not sufficient; the effect was short-lived. In three studies, increasing autonomy in the design or use of algorithms proved to be an effective solution. When users had some autonomy, they and others were more positive about using algorithms, and the decisions they made were better. Finally, I found that people who are more conscientious and have more knowledge 'listen' better to the advice of algorithms, which in turn leads to better decisions. Do you want to make better decisions? Use a simple algorithm.
    Original language: English
    Qualification: Doctor of Philosophy
    Awarding institution: University of Groningen
    Supervisors:
    • Meijer, Rob, Supervisor
    • Tendeiro, Jorge, Supervisor
    • Niessen, Susan, Co-supervisor
    Award date: 15-May-2023
    Place of publication: [Groningen]
    Publication status: Published - 2023
