TY - JOUR
T1 - The Murky Waters of Algorithmic Profiling
T2 - Examining discrimination in the digitalized enforcement of social security policy
AU - Haitsma, Lucas
PY - 2023/12/8
Y1 - 2023/12/8
N2 - In the Netherlands, organizations tasked with investigating and sanctioning social security fraud often use algorithmic profiling technologies. Despite their potential benefits, these technologies carry risks, including discrimination. Because it is unclear how non-discrimination legislation translates into practice, organizations must engage with and operationalize this legislation ex ante by identifying and mitigating risks of discrimination. This paper explores the question: how does the use of algorithmic profiling technologies lead to discrimination, and can social security organizations mitigate these risks? To answer this question, the theoretical framework of the algorithmic lifecycle is used to identify risks of discrimination in the design, implementation, and use of algorithmic profiling technologies. In particular, semi-structured interviews with experts and the Dutch Childcare Benefits Scandal case are used to identify risks of discrimination and to examine how they interact to produce discriminatory outcomes in practice. The findings demonstrate that discrimination results from a complex interaction of unmitigated risks throughout the algorithmic lifecycle of profiling technologies, highlighting the importance of a lifecycle approach to identifying and mitigating risks of discrimination.
KW - Discrimination
KW - Algorithmic profiling
KW - Social security
KW - Childcare Benefits Scandal
KW - Algorithmic lifecycle
U2 - 10.5553/RdW/138064242023044002004
DO - 10.5553/RdW/138064242023044002004
M3 - Article
SN - 1380-6424
VL - 44
SP - 61
EP - 83
JO - Recht der Werkelijkheid
JF - Recht der Werkelijkheid
IS - 2
ER -